# Getting Started - Introduction
**Remult** is a fullstack CRUD framework that uses your TypeScript model types to provide:
- Secure REST API (highly configurable)
- Type-safe frontend API client
- Type-safe backend query builder
#### Use the same model classes for both frontend and backend code
With Remult it is simple to keep your code [DRY](https://en.wikipedia.org/wiki/Don%27t_repeat_yourself) and increase development speed and maintainability by defining a single TypeScript model class (for each domain object) and sharing it between your frontend and backend code.
As Remult is "aware" of the runtime context (frontend or backend), data validations and entity lifecycle hooks can be written in layer-agnostic TypeScript which will run, as needed, on either the frontend, the backend, or both.
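For example, a validation defined once on a shared entity runs in the browser for immediate feedback and again on the backend before data is saved. Here is a minimal sketch of the idea; the `Task` entity and its fields are illustrative:
```ts
// shared/task.ts - a sketch of a shared, layer-agnostic validation
import { Entity, Fields } from 'remult'

@Entity('tasks', { allowApiCrud: true })
export class Task {
  @Fields.uuid()
  id!: string

  // runs on the frontend before the request is sent,
  // and again on the backend before the row is saved
  @Fields.string({
    validate: (task) => {
      if (task.title.length < 3) throw 'Title is too short'
    },
  })
  title = ''
}
```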
## Choose Your Remult Learning Path
Explore the flexibility of Remult through different learning paths tailored to match your style and project needs.
### `Option A`: Start with the Interactive Online Tutorial
If you're new to Remult or prefer a guided, hands-on approach, we recommend starting with our [interactive online tutorial](https://learn.remult.dev). This tutorial will walk you through building a full-stack application step by step, providing immediate feedback and insights as you learn.
### `Option B`: Create a new Project
[`npm init remult@latest`](./creating-a-project.md)
### `Option C`: Follow a Step-by-step Tutorial
### `Option D`: Quickstart
Use this [Quickstart](./quickstart.md) guide to quickly set up and try out Remult, or to add Remult to an existing app.
### `Option E`: Browse Example Apps
[Example Apps](./example-apps.md)
### `Option F`: Video Tutorials
Check out these official [Remult video tutorials](https://youtube.com/playlist?list=PLlcnBwFkuOn166nXXxxfL9Hee-1GWlDSm&si=TDlwIFDLi4VMi-as).
# Getting Started - Creating a project
# Creating a Remult Project
_The easiest way to start building a Remult app_
```bash
npm init remult@latest
```
Yes, that's it!
::: tip
Let us know how you liked the process! [@remultjs](https://twitter.com/RemultJs)
:::
## Demo
![npm init remult](../public/npm_init_remult.gif)
## What you get?
### 1. **Tailored Setup**
Answer a few questions about your preferred tech stack and project requirements.
`Project name`: The name of your project _(it will create a folder with this name)_
`Choose your Framework`
`Choose your Web Server` _(if needed)_
`Choose your Database`
`Authentication`: Do you want to add `auth.js` to your project? This includes a complete implementation for the `credentials` and `github` providers
`Add CRUD demo`: A comprehensive example of how to use an entity. It will show you how to create, read, update and delete data.
`Admin UI`: Will be available at `/api/admin` (see the sketch below for how it's enabled)
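If you opt into the Admin UI, it is served by the Remult server middleware. As a rough sketch (assuming the Express setup shown later in this guide), it is switched on with the `admin` option:
```ts
// a sketch of enabling the Admin UI - the other server adapters work the same way
import express from 'express'
import { remultExpress } from 'remult/remult-express'

const app = express()
app.use(
  remultExpress({
    admin: true, // serves the Admin UI at /api/admin
  }),
)
app.listen(3000)
```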
### 2. **Instant Configuration**
Based on your answers, Remult will configure the project with the best-suited options. Across all combinations of frameworks, servers, databases, and authentication, we manage more than `180 different project flavors`! Are we missing yours? Let us know!
### 3. **Feature-Rich Demo**
Once you run your project, you'll be greeted with a comprehensive dashboard that showcases all of Remult's powerful features. It will look like this:
![Remult Dashboard](/create-remult.png)
Each tile is a fully functional example of a feature that you selected.
### 4. **Easy Eject**
Simply remove the demo folder to eject the demo components.
# Getting Started - Quickstart
# Quickstart
Jumpstart your development with this Quickstart guide. Learn to seamlessly integrate Remult in various stacks, from installation to defining entities for efficient data querying and manipulation.
### Experience Remult with an Interactive Tutorial
For a guided, hands-on experience, [try our interactive online tutorial](https://learn.remult.dev/). It's the fastest way to get up and running with Remult and understand its powerful features.
## Installation
The _remult_ package is all you need for both frontend and backend code. If you're using one `package.json` for both frontend and backend (or a meta-framework) - **install Remult once** in the project's root folder. If you're using multiple `package.json` files (monorepo) - **install Remult in both server and client folders**.
::: code-group
```sh [npm]
npm install remult
```
```sh [yarn]
yarn add remult
```
```sh [pnpm]
pnpm add remult
```
```sh [bun]
bun add remult
```
:::
## Server-side Initialization
Remult is initialized on the server-side as a request handling middleware, with **a single line of code**. Here is the code for setting up the Remult middleware:
::: code-group
```ts [Express]
import express from 'express'
import { remultExpress } from 'remult/remult-express'
const app = express()
app.use(remultExpress({})) // [!code highlight]
app.listen(3000)
```
```ts [Fastify]
import fastify from 'fastify'
import { remultFastify } from 'remult/remult-fastify'
(async () => {
const server = fastify()
await server.register(remultFastify({})) // [!code highlight]
server.listen({ port: 3000 })
})()
```
```ts [Next.js]
// src/app/api/[...remult]/route.ts
import { remultNextApp } from 'remult/remult-next'
export const api = remultNextApp({}) // [!code highlight]
export const { GET, POST, PUT, DELETE } = api
```
```ts [Sveltekit]
// src/routes/api/[...remult]/+server.ts
import { remultSveltekit } from 'remult/remult-sveltekit'
export const _api = remultSveltekit({}) // [!code highlight]
export const { GET, POST, PUT, DELETE } = _api
```
```ts [nuxt.js]
// server/api/[...remult].ts
import { remultNuxt } from 'remult/remult-nuxt'
export const api = remultNuxt({})
export default defineEventHandler(api)
// enable experimental decorators
// Add to nuxt.config.ts
nitro: {
esbuild: {
options: {
tsconfigRaw: {
compilerOptions: {
experimentalDecorators: true,
},
},
},
},
},
vite: {
esbuild: {
tsconfigRaw: {
compilerOptions: {
experimentalDecorators: true,
},
},
},
},
```
```ts [Hapi]
import { type Plugin, server } from '@hapi/hapi'
import { remultHapi } from 'remult/remult-hapi'
(async () => {
const hapi = server({ port: 3000 })
await hapi.register(remultHapi({})) // [!code highlight]
hapi.start()
})()
```
```ts [Hono]
import { Hono } from 'hono'
import { serve } from '@hono/node-server'
import { remultHono } from 'remult/remult-hono'
const app = new Hono()
const api = remultHono({}) // [!code highlight]
app.route('', api) // [!code highlight]
serve(app)
```
```ts [Nest]
// src/main.ts
import { NestFactory } from '@nestjs/core'
import { AppModule } from './app.module'
import { remultExpress } from 'remult/remult-express'
async function bootstrap() {
const app = await NestFactory.create(AppModule)
app.use(remultExpress({})) // [!code highlight]
await app.listen(3000)
}
bootstrap()
```
```ts{9-17} [Koa]
import * as koa from 'koa'
import * as bodyParser from 'koa-bodyparser'
import { createRemultServer } from 'remult/server'
const app = new koa()
app.use(bodyParser())
const api = createRemultServer({})
app.use(async (ctx, next) => {
const r = await api.handle(ctx.request)
if (r) {
ctx.response.body = r.data
ctx.response.status = r.statusCode
} else return await next()
})
app.listen(3000, () => {})
```
:::
## Connecting a Database
Use the `dataProvider` property of Remult's server middleware to set up a database connection for Remult.
::: tip Recommended - Use default local JSON files and connect a database later
If the `dataProvider` property is not set, Remult stores data as JSON files under the `./db` folder.
:::
Here are examples of connecting to some commonly used back-end databases:
::: tabs
== Postgres
Install node-postgres:
```sh
npm i pg
```
Set the `dataProvider` property:
```ts{3,7,11-15}
import express from "express"
import { remultExpress } from "remult/remult-express"
import { createPostgresDataProvider } from "remult/postgres"
const app = express()
const connectionString = "postgres://user:password@host:5432/database"
app.use(
remultExpress({
dataProvider:
createPostgresDataProvider({
connectionString, // default: process.env["DATABASE_URL"]
// configuration: {} // optional = a `pg.PoolConfig` object or "heroku"
})
})
)
```
Or use your existing postgres connection
```ts
import express from 'express'
import { Pool } from 'pg'
import { SqlDatabase } from 'remult'
import { PostgresDataProvider } from 'remult/postgres'
import { remultExpress } from 'remult/remult-express'
const pg = new Pool({
connectionString: '....',
})
const app = express()
app.use(
remultExpress({
dataProvider: new SqlDatabase(new PostgresDataProvider(pg)),
}),
)
```
== MySQL
Install knex and mysql2:
```sh
npm i knex mysql2
```
Set the `dataProvider` property:
```ts{3,9-18}
import express from "express"
import { remultExpress } from "remult/remult-express"
import { createKnexDataProvider } from "remult/remult-knex"
const app = express()
app.use(
remultExpress({
dataProvider: createKnexDataProvider({
// Knex client configuration for MySQL
client: "mysql2",
connection: {
user: "your_database_user",
password: "your_database_password",
host: "127.0.0.1",
database: "test"
}
})
})
)
```
Or use your existing knex provider
```ts
import express from 'express'
import { KnexDataProvider } from 'remult/remult-knex'
import { remultExpress } from 'remult/remult-express'
import knex from 'knex'
const knexDb = knex({
client: '...',
connection: '...',
})
const app = express()
app.use(
remultExpress({
dataProvider: new KnexDataProvider(knexDb), // [!code highlight]
}),
)
```
== MongoDB
Install mongodb:
```sh
npm i mongodb
```
Set the `dataProvider` property:
```ts{3-4,10-14}
import express from "express"
import { remultExpress } from "remult/remult-express"
import { MongoClient } from "mongodb"
import { MongoDataProvider } from "remult/remult-mongo"
const app = express()
app.use(
remultExpress({
dataProvider: async () => {
const client = new MongoClient("mongodb://localhost:27017/local")
await client.connect()
return new MongoDataProvider(client.db("test"), client)
}
})
)
```
== SQLite
There are several supported SQLite providers:
### Better-sqlite3
Install better-sqlite3:
```sh
npm i better-sqlite3
```
Set the `dataProvider` property:
```ts
import express from 'express'
import { remultExpress } from 'remult/remult-express'
import { SqlDatabase } from 'remult' // [!code highlight]
import Database from 'better-sqlite3' // [!code highlight]
import { BetterSqlite3DataProvider } from 'remult/remult-better-sqlite3' // [!code highlight]
const app = express()
app.use(
remultExpress({
dataProvider: new SqlDatabase( // [!code highlight]
new BetterSqlite3DataProvider(new Database('./mydb.sqlite')), // [!code highlight]
), // [!code highlight]
}),
)
```
### sqlite3
This version of sqlite3 works even on StackBlitz.
Install sqlite3:
```sh
npm i sqlite3
```
Set the `dataProvider` property:
```ts
import express from 'express'
import { remultExpress } from 'remult/remult-express'
import { SqlDatabase } from 'remult' // [!code highlight]
import sqlite3 from 'sqlite3' // [!code highlight]
import { Sqlite3DataProvider } from 'remult/remult-sqlite3' // [!code highlight]
const app = express()
app.use(
remultExpress({
dataProvider: new SqlDatabase( // [!code highlight]
new Sqlite3DataProvider(new sqlite3.Database('./mydb.sqlite')), // [!code highlight]
), // [!code highlight]
}),
)
```
### bun:sqlite
Set the `dataProvider` property:
```ts
import express from 'express'
import { remultExpress } from 'remult/remult-express'
import { SqlDatabase } from 'remult' // [!code highlight]
import { Database } from 'bun:sqlite' // [!code highlight]
import { BunSqliteDataProvider } from 'remult/remult-bun-sqlite' // [!code highlight]
const app = express()
app.use(
remultExpress({
dataProvider: new SqlDatabase( // [!code highlight]
new BunSqliteDataProvider(new Database('./mydb.sqlite')), // [!code highlight]
), // [!code highlight]
}),
)
```
### sql.js
Install sql.js:
```sh
npm i sql.js
```
Set the `dataProvider` property:
```ts
import express from 'express'
import { remultExpress } from 'remult/remult-express'
import { SqlDatabase } from 'remult' // [!code highlight]
import initSqlJs from 'sql.js' // [!code highlight]
import { SqlJsDataProvider } from 'remult/remult-sql-js' // [!code highlight]
const app = express()
app.use(
remultExpress({
dataProvider: new SqlDatabase( // [!code highlight]
new SqlJsDataProvider(initSqlJs().then((x) => new x.Database())), // [!code highlight]
), // [!code highlight]
}),
)
```
### Turso
Install the Turso client (`@libsql/client`):
```sh
npm install @libsql/client
```
Set the `dataProvider` property:
```ts
import express from 'express'
import { remultExpress } from 'remult/remult-express'
import { SqlDatabase } from 'remult' // [!code highlight]
import { createClient } from '@libsql/client' // [!code highlight]
import { TursoDataProvider } from 'remult/remult-turso' // [!code highlight]
const app = express()
app.use(
remultExpress({
dataProvider: new SqlDatabase( // [!code highlight]
new TursoDataProvider( // [!code highlight]
createClient({ // [!code highlight]
url: process.env.TURSO_DATABASE_URL, // [!code highlight]
authToken: process.env.TURSO_AUTH_TOKEN, // [!code highlight]
}), // [!code highlight]
), // [!code highlight]
), // [!code highlight]
}),
)
```
== Microsoft SQL Server
Install knex and tedious:
```sh
npm i knex tedious
```
Set the `dataProvider` property:
```ts{5,11-25}
// index.ts
import express from "express"
import { remultExpress } from "remult/remult-express"
import { createKnexDataProvider } from "remult/remult-knex"
const app = express()
app.use(
remultExpress({
dataProvider: createKnexDataProvider({
// Knex client configuration for MSSQL
client: "mssql",
connection: {
server: "127.0.0.1",
database: "test",
user: "your_database_user",
password: "your_database_password",
options: {
enableArithAbort: true,
encrypt: false,
instanceName: `sqlexpress`
}
}
})
})
)
```
Or use your existing knex provider
```ts
import express from 'express'
import { KnexDataProvider } from 'remult/remult-knex'
import { remultExpress } from 'remult/remult-express'
import knex from 'knex'
const knexDb = knex({
client: '...',
connection: '...',
})
const app = express()
app.use(
remultExpress({
dataProvider: new KnexDataProvider(knexDb), // [!code highlight]
}),
)
```
== DuckDB
Install DuckDB:
```sh
npm i duckdb
```
Set the `dataProvider` property:
```ts
import express from 'express'
import { remultExpress } from 'remult/remult-express'
import { SqlDatabase } from 'remult' // [!code highlight]
import { Database } from 'duckdb' // [!code highlight]
import { DuckDBDataProvider } from 'remult/remult-duckdb' // [!code highlight]
const app = express()
app.use(
remultExpress({
dataProvider: new SqlDatabase( // [!code highlight]
new DuckDBDataProvider(new Database(':memory:')), // [!code highlight]
), // [!code highlight]
}),
)
```
== Oracle
Install knex and oracledb:
```sh
npm i knex oracledb
```
Set the `dataProvider` property:
```ts{5,11-19}
// index.ts
import express from "express"
import { remultExpress } from "remult/remult-express"
import { createKnexDataProvider } from "remult/remult-knex"
const app = express()
app.use(
remultExpress({
dataProvider: createKnexDataProvider({
// Knex client configuration for Oracle
client: "oracledb",
connection: {
user: "your_database_user",
password: "your_database_password",
connectString: "SERVER"
}
})
})
)
```
Or use your existing knex provider
```ts
import express from 'express'
import { KnexDataProvider } from 'remult/remult-knex'
import { remultExpress } from 'remult/remult-express'
import knex from 'knex'
const knexDb = knex({
client: '...',
connection: '...',
})
const app = express()
app.use(
remultExpress({
dataProvider: new KnexDataProvider(knexDb), // [!code highlight]
}),
)
```
== JSON Files
Set the `dataProvider` property:
```ts{5-6,12-14}
// index.ts
import express from "express"
import { remultExpress } from "remult/remult-express"
import { JsonDataProvider } from "remult"
import { JsonEntityFileStorage } from "remult/server"
const app = express()
app.use(
remultExpress({
dataProvider: async () =>
new JsonDataProvider(new JsonEntityFileStorage("./db"))
})
)
```
:::
## Integrate Auth
**Remult is completely unopinionated when it comes to user authentication.** You are free to use any kind of authentication mechanism, and only required to provide Remult with a [`getUser`](./ref_remultserveroptions.md#getuser) function that extracts a user object (which implements the minimal Remult `UserInfo` interface) from a request.
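In its simplest form, `getUser` just maps the incoming request to an object satisfying `UserInfo`, which requires an `id` and optionally accepts `name` and `roles`. A minimal hand-rolled sketch (the token-parsing helper is illustrative and stands in for your real auth logic):
```ts
import type { UserInfo } from 'remult'
import { remultExpress } from 'remult/remult-express'

// an illustrative helper - replace with your real JWT/session/token logic
function getUserFromRequest(authHeader?: string): UserInfo | undefined {
  if (!authHeader) return undefined
  // UserInfo only requires an `id`; `name` and `roles` are optional
  return { id: authHeader, name: 'demo user', roles: [] }
}

const api = remultExpress({
  getUser: async (req) => getUserFromRequest(req.headers.authorization),
})
```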
Here are examples of integrating some commonly used auth providers:
::: code-group
```ts [express-session]
import express from 'express'
import session from 'express-session'
import { remultExpress } from 'remult/remult-express'
const app = express()
app.use(
session({
/* ... */
}),
)
app.post('/api/signIn', (req, res) => {
req.session!['user'] = { id: 1, name: 'admin', roles: ['admin'] }
})
app.use(
remultExpress({
getUser: (req) => req.session!['user'], // [!code highlight]
}),
)
```
```ts{8-13} [next-auth]
// src/app/api/[...remult]/route.ts
import { remultNextApp } from 'remult/remult-next'
import { getServerSession } from 'next-auth'
import { authOptions } from '../auth/[...nextauth]/route'
export const api = remultNextApp({
getUser: async () => {
const user = (await getServerSession(authOptions))?.user
return user?.email && user?.name
? { id: user?.email, name: user?.name }
: undefined
},
})
export const { POST, PUT, DELETE, GET, withRemult } = api
```
:::
## Defining and Serving an Entity
Remult entity classes are shared between frontend and backend code.
```ts
// shared/product.ts
import { Entity, Fields } from 'remult'
@Entity('products', {
allowApiCrud: true,
allowApiDelete: 'admin',
})
export class Product {
@Fields.uuid()
id!: string
@Fields.string()
name = ''
@Fields.number()
unitPrice = 0
}
```
Alternatively, [generate entities](./entities-codegen-from-db-schema.md) from an existing Postgres database.
### Serve Entity CRUD API
All Remult server middleware options contain an [`entities`](./ref_remultserveroptions.md#entities) array. Use it to register your Entity.
```ts
// backend/index.ts
app.use(
remultExpress({
entities: [Product], // [!code highlight]
}),
)
```
## Using your Entity on the Client
To start querying and mutating data from the client-side using Remult, use the [`remult.repo`](./ref_remult.md#repo) function to create a [`Repository`](./ref_repository.md) object for your entity class. This approach simplifies data operations, allowing you to interact with your backend with the assurance of type safety.
```ts
// frontend/code.ts
import { remult } from 'remult'
import { Product } from '../shared/product'
const productsRepo = remult.repo(Product)
async function playWithRemult() {
// add a new product to the backend database
await productsRepo.insert({ name: 'Tofu', unitPrice: 5 })
// fetch products from backend database
const products = await productsRepo.find({
where: { unitPrice: { '>=': 5 } },
orderBy: { name: 'asc' },
limit: 10,
})
console.log(products)
// update product data
const tofu = products.find((p) => p.name === 'Tofu')!
await productsRepo.save({ ...tofu, unitPrice: tofu.unitPrice + 5 })
// delete product
await productsRepo.delete(tofu)
}
playWithRemult()
```
## Client-side Customization
::: tip Recommended Defaults
By default, remult uses the browser's [fetch API](https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API), and makes data API calls using the base URL `/api` (same-origin).
:::
### Changing the default API base URL
To use a different origin or base URL for API calls, set the remult object's `apiClient.url` property.
```ts
remult.apiClient.url = 'http://localhost:3002/api'
```
### Using an alternative HTTP client
Set the `remult` object's `apiClient.httpClient` property to customize the HTTP client used by Remult:
::: code-group
```ts [Axios instead of Fetch]
import axios from 'axios'
import { remult } from 'remult'
remult.apiClient.httpClient = axios
```
```ts [Angular HttpClient instead of Fetch]
//...
import { HttpClientModule, HttpClient } from '@angular/common/http'
import { remult } from 'remult'
@NgModule({
//...
imports: [
//...
HttpClientModule,
],
})
export class AppModule {
constructor(http: HttpClient) {
remult.apiClient.httpClient = http
}
}
```
:::
# Getting Started - Example Apps
# Example Apps
We already have a _ton_ of examples! Pick the one that fits your needs 😊
## Todo MVC
## CRM Demo
A fully featured CRM! Make sure to check out the Dev / Admin links at the top right!
## Shadcn React Table
Using Remult with server-side sorting, filtering, paging & CRUD.
## TanStack React Table
An example of using Remult with TanStack React Table: a basic design with server-side sorting, paging & filtering.
## 🚀 Ready to play
An environment for reproducing issues on StackBlitz, with an optional SQLite database.
## Group by Example
An example of using `groupBy`.
## Todo for most frameworks
- [React & Express](https://github.com/remult/remult/tree/main/examples/react-todo)
- [React & bun & Hono](https://github.com/remult/remult/tree/main/examples/bun-react-hono-monorepo-todo)
- [Next.js (App Router)](https://github.com/remult/remult/tree/main/examples/nextjs-app-router-todo)
- [Next.js (Pages)](https://github.com/remult/remult/tree/main/examples/nextjs-todo)
- [Angular & Express](https://github.com/remult/remult/tree/main/examples/angular-todo)
- [Angular & Fastify](https://github.com/remult/remult/tree/main/examples/angular-todo-fastify)
- [Vue](https://github.com/remult/remult/tree/main/examples/vue-todo)
- [Nuxt3](https://github.com/remult/remult/tree/main/examples/nuxt-todo)
- [SvelteKit](https://github.com/remult/remult/tree/main/examples/sveltekit-todo)
- [SolidStart](https://github.com/remult/remult/tree/main/examples/solid-start-todo)
## Other examples
- [Using BackendMethod queued option](https://stackblitz.com/edit/github-vwfkxu?file=src%2FApp.tsx)
- [Using SubscriptionChannel to update the frontend](https://stackblitz.com/edit/github-3nmwrp?file=src%2FApp.tsx)
- [Next.js Auth with remult user table](https://github.com/noam-honig/nextjs-auth-remult-user-table)
- [Unit tests for api](https://stackblitz.com/edit/api-test-example?file=test.spec.ts,model.ts)
- [Extending field options with open api specific properties](https://github.com/noam-honig/adding-open-api-options/)
# Entities - Fields
# Field Types
## Common field types
There are several built-in field decorators for common use cases:
### @Fields.string
A field of type string
```ts
@Fields.string()
title = '';
```
### @Fields.number
Just like TypeScript, by default any number is a decimal (or float).
```ts
@Fields.number()
price = 1.5
```
### @Fields.integer
For cases where you don't want to have decimal values, you can use the `@Fields.integer` decorator
```ts
@Fields.integer()
quantity = 0;
```
### @Fields.boolean
```ts
@Fields.boolean()
completed = false
```
### @Fields.date
```ts
@Fields.date()
statusDate = new Date()
```
### @Fields.dateOnly
Just like TypeScript, by default any `Date` field includes the time as well.
For cases where you only want a date, and don't want to meddle with time and time zone issues, use the `@Fields.dateOnly` decorator:
```ts
@Fields.dateOnly()
birthDate?:Date;
```
### @Fields.createdAt
Automatically set on the backend on insert, and can't be set through the API
```ts
@Fields.createdAt()
createdAt = new Date()
```
### @Fields.updatedAt
Automatically set on the backend on update, and can't be set through the API
```ts
@Fields.updatedAt()
updatedAt = new Date()
```
## JSON Field
You can store JSON data and arrays in fields.
```ts
@Fields.json()
tags: string[] = []
```
## Auto Generated Id Field Types
### @Fields.uuid
This id value is determined on the backend on insert, and can't be updated through the API.
```ts
@Fields.uuid()
id:string
```
### @Fields.cuid
This id value is determined on the backend on insert, and can't be updated through the API.
Uses the [@paralleldrive/cuid2](https://www.npmjs.com/package/@paralleldrive/cuid2) package
```ts
@Fields.cuid()
id:string
```
### @Fields.autoIncrement
This id value is determined by the underlying database on insert, and can't be updated through the API.
```ts
@Fields.autoIncrement()
id:number
```
### MongoDB ObjectId Field
To indicate that a field is of type ObjectId, change its `fieldTypeInDb` to `dbid`.
```ts
@Fields.string({
dbName: '_id',
valueConverter: {
fieldTypeInDb: 'dbid',
},
})
id: string = ''
```
## Enum Field
Enum fields allow you to define a field that can only hold values from a specific enumeration. The `@Fields.enum` decorator is used to specify that a field is an enum type. When using the `@Fields.enum` decorator, an automatic validation is added that checks if the value is valid in the specified enum.
```ts
@Fields.enum(() => Priority)
priority = Priority.Low;
```
In this example, the `priority` field is defined as an enum type using the `@Fields.enum` decorator. The `Priority` enum is passed as an argument to the decorator, ensuring that only valid `Priority` enum values can be assigned to the `priority` field. The `Validators.enum` validation is used and ensures that any value assigned to this field must be a member of the `Priority` enum, providing type safety and preventing invalid values.
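For completeness, here is a sketch of what the surrounding code might look like; the `Priority` enum and the `Task` entity are illustrative:
```ts
import { Entity, Fields } from 'remult'

// an illustrative enum - any TypeScript enum works the same way
export enum Priority {
  Low,
  Medium,
  High,
}

@Entity('tasks', { allowApiCrud: true })
export class Task {
  @Fields.uuid()
  id!: string

  // only members of the Priority enum are accepted;
  // anything else fails the automatic enum validation
  @Fields.enum(() => Priority)
  priority = Priority.Low
}
```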
## Literal Fields (Union of string values)
Literal fields let you restrict a field to a specific set of string values using the `@Fields.literal` decorator. This is useful for fields with a finite set of possible values.
```ts
@Fields.literal(() => ['open', 'closed', 'frozen', 'in progress'] as const)
status: 'open' | 'closed' | 'frozen' | 'in progress' = 'open';
```
In this example, we use the `as const` assertion to ensure that the array `['open', 'closed', 'frozen', 'in progress']` is treated as a readonly array, which allows TypeScript to infer the literal types 'open', 'closed', 'frozen', and 'in progress' for the elements of the array. This is important for the type safety of the `status` field.
The `status` field is typed as `'open' | 'closed' | 'frozen' | 'in progress'`, which means it can only hold one of these string literals. The `@Fields.literal` decorator is used to specify that the `status` field can hold values from this set of strings, and it uses the `Validators.in` validator to ensure that the value of `status` matches one of the allowed values.
For better reusability and maintainability, and to follow the DRY (Don't Repeat Yourself) principle, it is recommended to refactor the literal type and the array of allowed values into separate declarations:
```ts
const statuses = ['open', 'closed', 'frozen', 'in progress'] as const;
type StatusType = typeof statuses[number];
@Fields.literal(() => statuses)
status: StatusType = 'open';
```
In this refactored example, `statuses` is a readonly array of the allowed values, and `StatusType` is a type derived from the elements of `statuses`. The `@Fields.literal` decorator is then used with the `statuses` array, and the `status` field is typed as `StatusType`. This approach makes it easier to manage and update the allowed values for the `status` field, reducing duplication and making the code more robust and easier to maintain.
## ValueListFieldType
### Overview
The `ValueListFieldType` is useful in cases where simple enums and unions are not enough, such as when you want to have more properties for each value. For example, consider representing countries where you want to have a country code, description, currency, and international phone prefix.
### Defining a ValueListFieldType
Using enums or union types for this purpose can be challenging. Instead, you can use the `ValueListFieldType`:
```ts
@ValueListFieldType()
export class Country {
static us = new Country('us', 'United States', 'USD', '1')
static canada = new Country('ca', 'Canada', 'CAD', '1')
static france = new Country('fr', 'France', 'EUR', '33')
constructor(
public id: string,
public caption: string,
public currency: string,
public phonePrefix: string,
) {}
}
```
### Using in an Entity
In your entity, you can define the field as follows:
```ts
@Field(() => Country)
country: Country = Country.us;
```
### Accessing Properties
The property called `id` will be stored in the database and used through the API, while in the code itself, you can use each property:
```ts
call('+' + person.country.phonePrefix + person.phone)
```
Note: Only the `id` property is saved in the database and used in the API. Other properties, such as `caption`, `currency`, and `phonePrefix`, are only accessible in the code and are not persisted in the database.
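To illustrate, here is a sketch assuming a `Person` entity whose `country` field is defined as above:
```ts
const person = await remult.repo(Person).insert({ country: Country.france })
// the REST API and the database store only the id: 'fr'
// in code, the full Country object is available again:
console.log(person.country.currency) // 'EUR'
```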
### Getting Optional Values
To get the optional values for `Country`, you can use the `getValueList` function, which is useful for populating combo boxes:
```ts
console.table(getValueList(Country))
```
### Special Properties: id and caption
The `id` and `caption` properties are special in that the `id` will be used to save and load from the database, and the `caption` will be used as the display value.
### Automatic Generation of id and caption
If `id` and/or `caption` are not provided, they are automatically generated based on the static member name. For example:
```ts
@ValueListFieldType()
export class TaskStatus {
static open = new TaskStatus() // { id: 'open', caption: 'Open' }
static closed = new TaskStatus() // { id: 'closed', caption: 'Closed' }
id!: string
caption!: string
constructor() {}
}
```
In this case, the `open` member will have an `id` of `'open'` and a `caption` of `'Open'`, and similarly for the `closed` member.
### Handling Partial Lists of Values
In cases where you only want to generate members for a subset of values, you can use the `getValues` option of `@ValueListFieldType` to specify which values should be included:
```ts
@ValueListFieldType({
getValues: () => [
Country.us,
Country.canada,
Country.france,
{ id: 'uk', caption: 'United Kingdom', currency: 'GBP', phonePrefix: '44' }
]
})
```
This approach is useful when you want to limit the options available for a field to a specific subset of values, without needing to define all possible values as static members.
::: warning Warning: TypeScript may throw an error similar to `Uncaught TypeError: Currency_1 is not a constructor`.
This happens in TypeScript versions <5.1.6 and target es2022. It's a TypeScript bug. To fix it, upgrade to version >=5.1.6 or change the target from es2022. Alternatively, you can call the `ValueListFieldType` decorator as a function after the type:
```ts
export class TaskStatus {
static open = new TaskStatus()
static closed = new TaskStatus()
id!: string
caption!: string
constructor() {}
}
ValueListFieldType()(TaskStatus)
```
:::
### Summary
The `ValueListFieldType` enables the creation of more complex value lists that provide greater flexibility and functionality for your application's needs beyond what enums and unions can offer. By allowing for additional properties and partial lists of values, it offers a versatile solution for representing and managing data with multiple attributes.
## Control Field Type in Database
In some cases, you may want to explicitly specify the type of a field in the database. This can be useful when you need to ensure a specific data type or precision for your field. To control the field type in the database, you can use the `fieldTypeInDb` option within the `valueConverter` property of a field decorator.
For example, if you want to ensure that a numeric field is stored as a decimal with specific precision in the database, you can specify the `fieldTypeInDb` as follows:
```ts
@Fields.number({
valueConverter: {
fieldTypeInDb: 'decimal(16,8)'
}
})
price = 0;
```
In this example, the `price` field will be stored as a `decimal` with 16 digits in total and 8 digits after the decimal point in the database. This allows you to control the storage format and precision of numeric fields in your database schema.
## Creating Custom Field Types
Sometimes, you may need to create custom field types to handle specific requirements or use cases in your application. By creating custom field types, you can encapsulate the logic for generating, validating, and converting field values.
### Example: Creating a Custom ID Field Type with NanoID
NanoID is a tiny, secure, URL-friendly, unique string ID generator. You can create a custom field type using NanoID to generate unique IDs for your entities. Here's an example of how to create a custom NanoID field type:
```typescript
import { nanoid } from 'nanoid'
import { Fields, type FieldOptions } from 'remult'
export function NanoIdField(
...options: FieldOptions[]
) {
return Fields.string(
{
allowApiUpdate: false, // Disallow updating the ID through the API
defaultValue: () => nanoid(), // Generate a new NanoID as the default value
saving: (_, record) => {
if (!record.value) {
record.value = nanoid() // Generate a new NanoID if the value is not set
}
},
},
...options,
)
}
```
In this example, the `NanoIdField` function creates a custom field type based on the `Fields.string` type. It uses the `nanoid` function to generate a unique ID as the default value and ensures that the ID is generated before saving the record if it hasn't been set yet. This custom field type can be used in your entities to automatically generate and assign unique IDs using NanoID.
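Using it in an entity then looks like any other field decorator. A brief sketch with an illustrative `Product` entity (the import path is assumed to be wherever you placed the helper above):
```ts
import { Entity, Fields } from 'remult'
import { NanoIdField } from './nano-id-field' // illustrative path to the helper above

@Entity('products', { allowApiCrud: true })
export class Product {
  // a unique NanoID is generated on insert and can't be updated through the API
  @NanoIdField()
  id = ''

  @Fields.string()
  name = ''
}
```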
## Customize DB Value Conversions
Sometimes you want to control how data is saved to the database, or how it is represented in the DTO object.
You can do that using the `valueConverter` option.
For example, the following code will save the `tags` as a comma separated string in the db.
```ts
@Fields.object({
valueConverter: {
toDb: x => (x ? x.join(",") : undefined),
fromDb: x => (x ? x.split(",") : undefined)
}
})
tags: string[] = []
```
You can also refactor it into a field type of your own:
```ts
import { Fields, FieldOptions, Remult } from 'remult'
export function CommaSeparatedStringArrayField(
...options: (
| FieldOptions
| ((options: FieldOptions, remult: Remult) => void)
)[]
) {
return Fields.object(
{
valueConverter: {
toDb: (x) => (x ? x.join(',') : undefined),
fromDb: (x) => (x ? x.split(',') : undefined),
},
},
...options,
)
}
```
And then use it:
```ts
@CommaSeparatedStringArrayField()
tags: string[] = []
```
Several ready-made value converters are included in the `remult` package; they can be found in `remult/valueConverters`.
## Class Fields
Sometimes you may want a field's type to be a class. You can do that; you just need to provide an implementation for its conversion to and from JSON.
For example:
```ts
export class Phone {
constructor(public phone: string) {}
call() {
window.open('tel:' + this.phone)
}
}
@Entity('contacts')
export class Contact {
//...
@Field(() => Phone, {
valueConverter: {
fromJson: (x) => (x ? new Phone(x) : undefined!),
toJson: (x) => (x ? x.phone : undefined!),
},
})
phone?: Phone
}
```
Alternatively, you can decorate the `Phone` class with the `FieldType` decorator, so that its `valueConverter` is used whenever the class is used as a field type.
```ts
@FieldType({
valueConverter: {
fromJson: (x) => (x ? new Phone(x) : undefined!),
toJson: (x) => (x ? x.phone : undefined!),
},
})
export class Phone {
constructor(public phone: string) {}
call() {
window.open('tel:' + this.phone)
}
}
@Entity('contacts')
export class Contact {
//...
@Field(() => Phone)
phone?: Phone
}
```
# Entities - Relations 🚀
# Relations Between Entities
::: tip **Interactive Learning Available! 🚀**
Looking to get hands-on with this topic? Try out our new [**interactive tutorial**](https://learn.remult.dev/in-depth/1-relations/1-many-to-one) on Relations, where you can explore and practice directly in the browser. This guided experience offers step-by-step lessons to help you master relations in Remult with practical examples and exercises.
[Click here to dive into the interactive tutorial on Relations!](https://learn.remult.dev/in-depth/1-relations/1-many-to-one)
:::
### Understanding Entity Relations in Remult
In Remult, entity relations play a key role in modeling and navigating the relationships that exist within your data. To illustrate this concept, we will use two primary entities: `Customer` and `Order`. These entities will serve as the foundation for discussing various types of relations and how to define and work with them.
To experiment with these entities online, you can access the following CodeSandbox link, which is preconfigured with these two entities and a Postgres database:
[CodeSandbox - Remult Entity Relations Example](https://codesandbox.io/p/devbox/remult-postgres-demo-f934f8)
Feel free to explore and experiment with the provided entities and their relations in the CodeSandbox environment.
#### Customer Entity
```typescript
// customer.ts
import { Entity, Fields } from 'remult'
@Entity('customers')
export class Customer {
@Fields.cuid()
id = ''
@Fields.string()
name = ''
@Fields.string()
city = ''
}
```
The `Customer` entity represents individuals or organizations with attributes such as an ID, name, and city. Each customer can be uniquely identified by their `id`.
#### Order Entity
```typescript
// order.ts
import { Entity, Fields } from 'remult'
@Entity('orders')
export class Order {
@Fields.cuid()
id = ''
@Fields.string()
customer = ''
@Fields.number()
amount = 0
}
```
The `Order` entity represents transactions or purchases made by customers. Each order is associated with a `customer`, representing the customer who placed the order, and has an `amount` attribute indicating the total purchase amount.
Throughout the following discussion, we will explore how to define and use relations between these entities, enabling you to create sophisticated data models and efficiently query and manipulate data using Remult. Whether you are dealing with one-to-one, one-to-many, or many-to-many relationships, understanding entity relations is essential for building robust and feature-rich applications with Remult.
## Simple Many-to-One
In Remult, many-to-one relations allow you to establish connections between entities, where multiple records of one entity are associated with a single record in another entity. Let's delve into a common use case of a many-to-one relation, specifically the relationship between the `Order` and `Customer` entities.
### Defining the Relation
To establish a many-to-one relation from the `Order` entity to the `Customer` entity, you can use the `@Relations.toOne()` decorator in your entity definition:
```typescript
// order.ts
import { Entity, Fields, Relations } from 'remult'
import { Customer } from '../customer.js'
@Entity('orders')
export class Order {
@Fields.cuid()
id = ''
@Fields.string() // [!code --]
customer = '' // [!code --]
@Relations.toOne(() => Customer) // [!code ++]
customer?: Customer // [!code ++]
@Fields.number()
amount = 0
}
```
In this example, each `Order` is associated with a single `Customer`. The `customer` property in the `Order` entity represents this relationship.
### Fetching Relational Data
When querying data that involves a many-to-one relation, you can use the `include` option to specify which related entity you want to include in the result set. In this case, we want to include the associated `Customer` when querying `Order` records.
Here's how you can include the relation in a query using Remult:
```typescript{3-5}
const orderRepo = remult.repo(Order)
const orders = await orderRepo.find({
include: {
customer: true,
},
})
```
#### Resulting Data Structure
The result of the query will contain the related `Customer` information within each `Order` record, creating a nested structure.
Here's an example result of running `JSON.stringify` on the `orders` array:
```json
[
{
"id": "adjkzsio3efees8ew0wnsqma",
"customer": {
"id": "m4ozs74onwwroav3o1xs1qi8",
"name": "Larkin - Fadel",
"city": "London"
},
"amount": 90
},
{
"id": "gefhsed1clknmogcgiigo9jo",
"customer": {
"id": "m4ozs74onwwroav3o1xs1qi8",
"name": "Larkin - Fadel",
"city": "London"
},
"amount": 3
}
]
```
As shown in the result, each `Order` object contains a nested `customer` object, which holds the details of the associated customer, including their `id`, `name`, and `city`. This structured data allows you to work seamlessly with the many-to-one relationship between `Order` and `Customer` entities.
### Querying a Single Item
To retrieve a single `Order` item along with its associated `Customer`, you can use the `findFirst` method provided by your repository (`orderRepo` in this case). Here's an example of how to perform this query:
```typescript
const singleOrder = await orderRepo.findFirst(
{
id: 'adjkzsio3efees8ew0wnsqma',
},
{
include: {
customer: true,
},
},
)
```
### Relation Loading
In Remult, by default, a relation is not loaded unless explicitly specified in the `include` statement of a query. This behavior ensures that you only load the related data you require for a specific task, optimizing performance and minimizing unnecessary data retrieval.
Here's an example:
```typescript
const orderRepo = remult.repo(Order)
// Query without including the 'customer' relation
const ordersWithoutCustomer = await orderRepo.find({})
```
In the above query, the `customer` relation is not loaded and will have the value `undefined`, because it is not specified in the `include` statement.
#### Overriding Default Behavior with `defaultIncluded`
Sometimes, you may have scenarios where you want a relation to be included by default in most queries, but you also want the flexibility to exclude it in specific cases. Remult allows you to control this behavior by using the `defaultIncluded` setting in the relation definition.
```typescript
@Relations.toOne(() => Customer, {
defaultIncluded: true, // [!code ++]
})
customer = "";
```
In this example, we set `defaultIncluded` to `true` for the `customer` relation in the `Order` entity. This means that, by default, the `customer` relation will be loaded in most queries unless explicitly excluded.
#### Example: Excluding `customer` Relation in a Specific Query
```typescript
const orders = await orderRepo.find({
include: {
customer: false, // [!code ++]
},
})
```
In this query, we override the default behavior by explicitly setting `customer: false` in the `include` statement. This instructs Remult not to load the `customer` relation for this specific query, even though it is set to be included by default.
By combining the default behavior with the ability to override it in specific queries, Remult provides you with fine-grained control over relation loading, ensuring that you can optimize data retrieval based on your application's requirements and performance considerations.
## Advanced Many-to-One
In certain scenarios, you may require more granular control over the behavior of relations and want to access specific related data without loading the entire related entity. Remult provides advanced configuration options to meet these requirements. Let's explore how to achieve this level of control through advanced relation configurations.
### Custom Relation Field
In Remult, you can define custom relation fields that allow you to access the `id` without loading the entire related entity. To define a custom relation field, follow these steps:
#### Step 1: Define a Custom Field in the Entity
In your entity definition, define a custom field that will hold the identifier or key of the related entity. This field serves as a reference to the related entity without loading the entity itself.
```typescript{5-8}
@Entity("orders")
export class Order {
@Fields.cuid()
id = "";
@Fields.string() // [!code ++]
customerId = ""; // Custom field to hold the related entity's identifier // [!code ++]
@Relations.toOne(() => Customer, "customerId") // [!code ++]
@Relations.toOne(() => Customer) // [!code --]
customer?: Customer;
@Fields.number()
amount = 0;
}
```
In this example, we define a custom field called `customerId`, which stores the identifier of the related `Customer` entity.
#### Step 2: Define the Relation Using `toOne`
Use the `@Relations.toOne` decorator to define the relation, specifying the `fromEntity` and `toEntity` types in its generic parameters if needed. Additionally, provide the name of the custom field (in this case, `"customerId"`) as the second argument.
```typescript
@Entity('orders')
export class Order {
@Fields.cuid()
id = ''
@Fields.string()
customerId = '' // Custom field to hold the related entity's identifier
@Relations.toOne(() => Customer, 'customerId') // [!code ++]
customer?: Customer
@Fields.number()
amount = 0
}
```
This configuration establishes a relation between `Order` and `Customer` using the `customerId` field as the reference.
#### Migrating from a Simple `toOne` Relation to a Custom Field Relation with Existing Data
When transitioning from a simple `toOne` relation to a custom field relation in Remult and you already have existing data, it's important to ensure a smooth migration. In this scenario, you need to make sure that the newly introduced custom field (`customerId` in this case) can access the existing data in your database. This is accomplished using the `dbName` option. Here's how to perform this migration:
##### 1. Understand the Existing Data Structure
Before making any changes, it's crucial to understand the structure of your existing data. In the case of a simple `toOne` relation, there may be rows in your database where a field (e.g., `customer`) holds the identifier of the related entity.
##### 2. Define the Custom Field with `dbName`
When defining the custom field in your entity, use the `dbName` option to specify the name of the database column where the related entity's identifier is stored. This ensures that the custom field (`customerId` in this example) correctly accesses the existing data in your database.
```typescript
@Entity('orders')
export class Order {
@Fields.cuid()
id = ''
@Fields.string({ dbName: 'customer' }) // Use dbName to match existing data // [!code ++]
customerId = ''
@Relations.toOne(() => Customer, 'customerId')
customer?: Customer
@Fields.number()
amount = 0
}
```
In this example, we use the `dbName` option to specify that the `customerId` field corresponds to the `customer` column in the database. This mapping ensures that the custom field can access the existing data that uses the `customer` column for the related entity's identifier.
#### Using the `field` Option for Custom Relation Configuration
When you require additional customization for a relation field, you can use the `field` option to specify the referencing field alongside other options for the relation field.
```typescript
@Relations.toOne(() => Customer, {
field: "customerId", // [!code ++]
caption: "The Customer",
})
```
In this example, we use the `field` option to define a custom relation between the `Order` and `Customer` entities. Here are some key points to understand about using the `field` option:
1. **Custom Relation Field**: The `field` option allows you to specify a custom field name (e.g., `"customerId"`) that represents the relationship between entities. This field can be used to access related data without loading the entire related entity.
2. **Additional Configuration**: In addition to specifying the `field`, you can include other options as well. In this example, we set the `caption` option to provide a descriptive caption for the relation field.
Using the `field` option provides you with granular control over how the relation field is configured and accessed. You can customize various aspects of the relation to meet your specific requirements, enhance documentation, and improve the overall usability of your codebase.
### Relation Based on Multiple Fields
In some scenarios, establishing a relation between entities requires considering multiple fields to ensure the correct association. Remult provides the flexibility to define relations based on multiple fields using the `fields` option. Here's how to create a relation based on multiple fields in Remult:
#### Defining Entities
Let's consider a scenario where both `Order` and `Customer` entities belong to specific branches, and we also need the `branchId` fields to ensure the correct association. First, define your entities with the relevant fields:
```typescript
@Entity('customers')
export class Customer {
@Fields.cuid()
id = ''
@Fields.number() // [!code ++]
branchId = 0 // [!code ++]
@Fields.string()
name = ''
@Fields.string()
city = ''
}
@Entity('orders')
export class Order {
@Fields.cuid()
id = ''
@Fields.number() // [!code ++]
branchId = 0 // [!code ++]
@Fields.string({ dbName: 'customer' })
customerId = ''
@Relations.toOne(() => Customer, {
fields: {//[!code ++]
branchId: 'branchId', // Field from Customer entity : Field from Order// [!code ++]
id: 'customerId', // [!code ++]
}, // [!code ++]
})
customer?: Customer
@Fields.number()
amount = 0
}
```
In this example, we have two entities: `Customer` and `Order`. Both entities have a `branchId` field that represents the branch they belong to. To create a relation based on these fields, we specify the `fields` option in the relation configuration.
#### Using the `fields` Option
In the `@Relations.toOne` decorator, use the `fields` option to specify the mapping between fields in the related entity (`Customer`) and your entity (`Order`). Each entry in the `fields` object corresponds to a field in the related entity and maps it to a field in your entity.
```typescript
@Relations.toOne(() => Customer, {
fields: {// [!code ++]
branchId: 'branchId', // Field from Customer entity : Field from Order// [!code ++]
id: 'customerId',// [!code ++]
},// [!code ++]
})
customer?: Customer;
```
In this configuration:
- `branchId` from the `Customer` entity is mapped to `branchId` in the `Order` entity.
- `id` from the `Customer` entity is mapped to `customerId` in the `Order` entity.
This ensures that the relation between `Order` and `Customer` is based on both the `branchId` and `customerId` fields, providing a comprehensive association between the entities.
By utilizing the `fields` option, you can create relations that consider multiple fields, ensuring accurate and meaningful associations between your entities in Remult.
## One-to-Many
In Remult, you can easily define a `toMany` relation to retrieve multiple related records. Let's consider a scenario where you want to retrieve a list of orders for each customer. We'll start with the basic `toOne` relation example and then add a `toMany` relation to achieve this:
#### Basic `toOne` Relation Example
First, let's define the `Customer` and `Order` entities with a basic `toOne` relation:
```typescript{14-15}
@Entity("customers")
export class Customer {
@Fields.cuid()
id = "";
@Fields.string()
name = "";
@Fields.string()
city = "";
}
@Entity("orders")
export class Order {
@Fields.cuid()
id = "";
@Relations.toOne(() => Customer)
customer?: Customer;
@Fields.number()
amount = 0;
}
```
In this initial setup:
- The `Order` entity has a property `customer`, which is decorated with `@Relations.toOne(() => Customer)`. This establishes a relation between an order and its associated customer.
### Adding a `toMany` Relation
Now, let's enhance this setup to include a `toMany` relation that allows you to retrieve a customer's orders:
```typescript
@Entity('customers')
export class Customer {
@Fields.cuid()
id = ''
@Fields.string()
name = ''
@Fields.string()
city = ''
@Relations.toMany(() => Order) // [!code ++]
orders?: Order[] // [!code ++]
}
```
In this updated configuration:
- The `Customer` entity has a property `orders`, which is decorated with `@Relations.toMany(() => Order)`. This indicates that a customer can have multiple orders.
With this setup, you can use the `orders` property of a `Customer` entity to retrieve all the orders associated with that customer. This provides a convenient way to access and work with a customer's orders.
By defining a `toMany` relation, you can easily retrieve and manage multiple related records, such as a customer's orders.
### Fetching Relational Data
To retrieve customers along with their associated orders in Remult, you can use the `include` option in your query. Let's see how to fetch customers with their orders using the `include` option:
```typescript
const customerRepo = remult.repo(Customer)
const customers = await customerRepo.find({
include: {
orders: true,
},
})
```
In this code snippet:
- We first obtain a repository for the `Customer` entity using `remult.repo(Customer)`.
- Next, we use the `find` method to query the `Customer` entity. Within the query options, we specify the `include` option to indicate that we want to include related records.
- Inside the `include` option, we specify `orders: true`, indicating that we want to fetch the associated orders for each customer.
As a result, the `customers` variable will contain an array of customer records, with each customer's associated orders included. This allows you to easily access and work with both customer and order data.
#### Resulting Data Structure
When you fetch customers along with their associated orders using the `include` option in Remult, the result will be an array that includes both customer and order data.
Here's an example result of running `JSON.stringify` on the `customers` array:
```json
[
{
"id": "ik68p3oxqg1ygdffpryqwkpw",
"name": "Fay, Ebert and Sporer",
"city": "London",
"orders": [
{
"id": "m7m3xqyx4kwjaqcd0cu33q8g",
"amount": 15
},
{
"id": "rbkcrz6nc45zn4xfxmjise21",
"amount": 10
}
]
}
]
```
In this example:
- Each customer is represented as an object with properties such as `id`, `name`, and `city`.
- The `orders` property within each customer object contains an array of associated order records.
- Each order record within the `orders` array includes properties like `id` and `amount`.
This structured result allows you to easily navigate and manipulate the data. You can access customer information as well as the details of their associated orders, making it convenient to work with related records in your application's logic and UI.
### Specifying Reference Fields
In Remult, you can specify a field or fields for `toMany` relations to have more control over how related records are retrieved. This can be useful when you want to customize the behavior of the relation. Here's how you can specify a field or fields for `toMany` relations:
#### Specifying a Single Field
To specify a single field for a `toMany` relation, you can use the `field` option. This option allows you to define the field in your entity that establishes the relation. For example:
```typescript
@Relations.toMany(() => Order, {
field: "customer",
})
```
In this case, the `field` option is set to `"customer"`, indicating that the `customer` field in the `Order` entity establishes the relation between customers and their orders.
#### Specifying Multiple Fields
In some cases, you may need to specify multiple fields to establish a `toMany` relation. To do this, you can use the `fields` option, which allows you to define a mapping of fields between entities. Here's an example:
```typescript
@Relations.toMany(() => Order, {
fields: {
branchId: "branchId",
customerId: "id",
},
})
```
In this example, the `fields` option is used to specify that the `branchId` field in the `Order` entity corresponds to the `branchId` field in the `Customer` entity, and the `customerId` field in the `Order` entity corresponds to the `id` field in the `Customer` entity.
By specifying fields in this manner, you have fine-grained control over how the relation is established and how related records are retrieved. This allows you to tailor the behavior of `toMany` relations to your specific use case and data model.
### Customizing a `toMany` Relation
In Remult, you can exercise precise control over a `toMany` relation by utilizing the `findOptions` option. This option allows you to define specific criteria and behaviors for retrieving related records. Here's how you can use `findOptions` to fine-tune a `toMany` relation:
```typescript
@Relations.toMany(() => Order, {
fields: {
branchId: "branchId",
customerId: "id",
},
findOptions: {
limit: 5,
orderBy: {
amount: "desc",
},
where: {
amount: { $gt: 10 },
},
},
})
```
In this example, we've specified the following `findOptions`:
- `limit: 5`: Limits the number of related records to 5. Only the first 5 related records will be included.
- `orderBy: { amount: "desc" }`: Orders the related records by the `amount` field in descending order. This means that records with higher `amount` values will appear first in the result.
- `where: { amount: { $gt: 10 } }`: Applies a filter to include only related records where the `amount` is greater than 10. This filters out records with an `amount` of 10 or lower.
By using `findOptions` in this manner, you gain precise control over how related records are retrieved and included in your query results. This flexibility allows you to tailor the behavior of the `toMany` relation to suit your specific application requirements and use cases.
#### Fine-Tuning a `toMany` Relation with `include`
In Remult, you can exercise even more control over a `toMany` relation by using the `include` option within your queries. This option allows you to further customize the behavior of the relation for a specific query. Here's how you can use `include` to fine-tune a `toMany` relation:
```typescript
const orders = await customerRepo.find({
include: {
orders: {
limit: 10,
where: {
completed: true,
},
},
},
})
```
In this code snippet:
- We use the `include` option within our query to specify that we want to include the related `orders` for each customer.
- Inside the `include` block, we can provide additional options to control the behavior of this specific inclusion. For example:
- `limit: 10` limits the number of related orders to 10 per customer. This will override the `limit` set in the original relation.
- `where: { completed: true }` filters the included orders to only include those that have been marked as completed.
The `where` option specified within `include` will be combined with the `where` conditions defined in the `findOptions` of the relation using an "and" relationship. This means that both sets of conditions must be satisfied for related records to be included.
Using `include` in this way allows you to fine-tune the behavior of your `toMany` relation to meet the specific requirements of each query, making Remult a powerful tool for building flexible and customized data retrieval logic in your application.
## Repository `relations`
In Remult, managing relationships between entities is a crucial aspect of working with your data. When dealing with a `toMany` relationship, Remult provides you with powerful tools through the repository's `relations` property to handle related rows efficiently, whether you want to retrieve them or insert new related records.
### Inserting Related Records
Consider a scenario where you have a `Customer` entity with a `toMany` relationship to `Order` entities. You can create a new customer and insert related orders in a straightforward manner:
```typescript
const customer = await customerRepo.insert({ name: 'Abshire Inc' })
await customerRepo.relations(customer).orders.insert([
{
amount: 5,
},
{
amount: 7,
},
])
```
In this example, you first create a new `Customer` entity with the name "Abshire Inc." Then, using the `relations` method, you access the related `orders`. By calling the `insert` method on the `orders` relation, you can add new order records. Remult automatically sets the `customer` field for these orders based on the specific customer associated with the `relations` call.
### Loading Unfetched Relations
Another powerful use of the repository's `relations` method is to load related records that were not initially retrieved. Let's say you have found a specific customer and want to access their related orders:
```typescript
const customerRepo = remult.repo(Customer)
const customer = await customerRepo.findFirst({ name: 'Abshire Inc' })
const orders = await customerRepo.relations(customer).orders.find()
```
Here, you first search for a customer with the name "Abshire Inc." After locating the customer, you can use the `relations` method again to access their related orders. By calling the `find` method on the `orders` relation, you retrieve all related order records associated with the customer.
#### Contextual Repository: Tailored Operations for Related Data
The `relations` method acts as a gateway to specialized repositories that are tightly bound to the particular customer you supply to it. This gives you a tailored context for operations on that customer's connection to orders: you can find related records, insert new ones, calculate counts, and perform other relevant actions within the precise scope of that relationship, keeping your data interactions organized and efficient.
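For example, a minimal sketch of counting a specific customer's orders through this contextual repository:

```ts
// count only the orders that belong to this specific customer
const customerOrderCount = await customerRepo.relations(customer).orders.count()
```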
Remult's repository methods empower you to seamlessly manage and interact with related data, making it easier to work with complex data structures and relationships in your applications. Whether you need to insert related records or load unfetched relations, these tools provide the flexibility and control you need to handle your data efficiently.
### Fetching Unloaded `toOne` Relations with `findOne`
In addition to loading unfetched `toMany` relations, Remult offers a convenient way to retrieve `toOne` relations that were not initially loaded. This capability is especially useful when dealing with many-to-one relationships.
Consider the following example, where we have a many-to-one relation between orders and customers. We want to fetch the customer related to a specific order, even if we didn't load it initially:
```ts
const orderRepo = remult.repo(Order)
const order = await orderRepo.findFirst({ id: 'm7m3xqyx4kwjaqcd0cu33q8g' })
const customer = await orderRepo.relations(order).customer.findOne()
```
In this code snippet:
1. We first obtain the order using the `findFirst` function, providing the order's unique identifier.
2. Next, we use the `relations` method to access the repository's relations and then chain the `customer` relation using dot notation.
3. Finally, we call `findOne()` on the `customer` relation to efficiently retrieve the related customer information.
This approach allows you to access and load related data on-demand, providing flexibility and control over your data retrieval process. Whether you're working with loaded or unloaded relations, Remult's intuitive functions give you the power to seamlessly access the data you need.
### Accessing Relations with `activeRecord`
If you're following the `activeRecord` pattern and your entity inherits from `EntityBase` or `IdEntity`, you can access relations directly from the entity instance. This approach offers a convenient and straightforward way to work with relations.
#### Inserting Related Records
You can insert related records directly from the entity instance. For example, consider a scenario where you have a `Customer` entity and a `toMany` relation with `Order` entities. Here's how you can insert related orders for a specific customer:
```ts
const customer = await customerRepo.insert({ name: 'Abshire Inc' })
await customer._.relations.orders.insert([
{
amount: 5,
},
{
amount: 7,
},
])
```
In this code:
- We create a new `Customer` instance using `customerRepo.insert()` and set its properties.
- Using `customer._.relations.orders`, we access the `orders` relation of the customer.
- We insert two orders related to the customer by calling `.insert()` on the `orders` relation.
#### Retrieving Related Records
Fetching related records is just as straightforward. Let's say you want to find a customer by name and then retrieve their related orders:
```ts
const customer = await customerRepo.findFirst({ name: 'Abshire Inc' })
const orders = await customer._.relations.orders.find()
```
In this code:
- We search for a customer with the specified name using `customerRepo.findFirst()`.
- Once we have the customer instance, we access their `orders` relation with `customer._.relations.orders`.
- We use `.find()` to retrieve all related orders associated with the customer.
Using the `activeRecord` pattern and direct access to relations simplifies the management of related data, making it more intuitive and efficient.
## Many-to-Many
In Remult, you can effectively handle many-to-many relationships between entities by using an intermediate table. This approach is especially useful when you need to associate multiple instances of one entity with multiple instances of another entity. In this section, we'll walk through the process of defining and working with many-to-many relationships using this intermediate table concept.
#### Entity Definitions:
To illustrate this concept, let's consider two entities: `Customer` and `Tag`. In this scenario, multiple customers can be associated with multiple tags.
```ts
@Entity('customers')
export class Customer {
@Fields.cuid()
id = ''
@Fields.string()
name = ''
@Fields.string()
city = ''
}
@Entity('tags')
export class Tag {
@Fields.cuid()
id = ''
@Fields.string()
name = ''
}
```
### Intermediate Table
To establish this relationship, we'll create an intermediate table called `tagsToCustomers`. In this table, both `customerId` and `tagId` fields are combined as the primary key.
```ts
@Entity('tagsToCustomers', {
id: {
customerId: true,
tagId: true,
},
})
export class TagsToCustomers {
@Fields.string()
customerId = ''
@Fields.string()
tagId = ''
@Relations.toOne(() => Tag, 'tagId')
tag?: Tag
}
```
- To uniquely identify associations between customers and tags in a many-to-many relationship, we use the combined `customerId` and `tagId` fields as the primary key, specified using the 'id' option in the `@Entity` decorator.
- In this scenario, we've defined a `toOne` relation to the `Tag` entity within the `TagsToCustomers` entity to efficiently retrieve tags associated with a specific customer. This approach simplifies the management of many-to-many relationships while ensuring unique identification of each association.
Now, let's enhance our customer entity with a toMany relationship, enabling us to fetch all of its associated tags effortlessly.
```ts
@Entity('customers')
export class Customer {
@Fields.cuid()
id = ''
@Fields.string()
name = ''
@Fields.string()
city = ''
@Relations.toMany(() => TagsToCustomers, 'customerId') // [!code ++]
tags?: TagsToCustomers[] // [!code ++]
}
```
### Working with Many-to-Many Relationships
Let's explore how to interact with many-to-many relationships using an intermediate table in Remult.
#### 1. Adding Tags to a Customer:
To associate tags with a customer, consider the following code:
```ts
const tags = await remult
.repo(Tag)
.insert([
{ name: 'vip' },
{ name: 'hot-lead' },
{ name: 'influencer' },
{ name: 'manager' },
]) // Create the tags
const customerRepo = remult.repo(Customer)
const customer = await customerRepo.findFirst({ name: 'Abshire Inc' })
await customerRepo
.relations(customer)
.tags.insert([{ tag: tags[0] }, { tag: tags[2] }])
```
Here's an explanation of what's happening in this code:
1. We first insert some tags into the "tags" entity.
2. We then create a repository instance for the "customer" entity using `remult.repo(Customer)`.
3. We retrieve a specific customer by searching for one with the name "Abshire Inc" using `customerRepo.findFirst({ name: "Abshire Inc" })`. The `customer` variable now holds the customer entity.
4. To associate tags with the customer, we use the `relations` method provided by the repository. This method allows us to work with the customer's related entities, in this case, the "tags" relation to the TagsToCustomers entity.
5. Finally, we call the `insert` method on the "tags" relationship and provide an array of tag objects to insert. In this example, we associate the customer with the "vip" tag and the "influencer" tag by specifying the tags' indices in the `tags` array.
#### 2. Retrieving Tags for a Customer:
To fetch the tags associated with a specific customer:
```ts
const customer = await customerRepo.findFirst(
{ name: 'Abshire Inc' },
{
include: {
tags: {
include: {
tag: true,
},
},
},
},
)
```
In this code, we're querying the "customer" entity to find a customer named "Abshire Inc." We're also including the related "tags" for that customer, along with the details of each tag. This allows us to fetch both customer and tag data in a single query, making it more efficient when working with related entities.
### Resulting Data Structure
Here's an example result of running `JSON.stringify` on the `customer` object:
```json
{
"id": "fki6t24zkykpljvh4jurzs97",
"name": "Abshire Inc",
"city": "New York",
"tags": [
{
"customerId": "fki6t24zkykpljvh4jurzs97",
"tagId": "aewm0odq9758nopgph3x7brt",
"tag": {
"id": "cf8xv3myluc7pmsgez3p9hn9",
"name": "vip"
}
},
{
"customerId": "fki6t24zkykpljvh4jurzs97",
"tagId": "aewm0odq9758nopgph3x7brt",
"tag": {
"id": "aewm0odq9758nopgph3x7brt",
"name": "influencer"
}
}
]
}
```
Utilizing an intermediate table for managing many-to-many relationships in Remult allows for a flexible and efficient approach to handle complex data associations. Whether you are connecting customers with tags or other entities, this method provides a powerful way to maintain data integrity and perform queries effectively within your application.
---
In this guide, we've explored the essential concepts of managing entity relations within the Remult library. From one-to-one to many-to-many relationships, we've covered the declaration, customization, and querying of these relations. By understanding the nuances of entity relations, users can harness the full potential of Remult to build robust TypeScript applications with ease.
# Relations 🚀 - Filtering and Relations
# Filtering and Relations
::: tip **Interactive Learning Available! 🚀**
Looking to get hands-on with this topic? Try out our new [**interactive tutorial**](https://learn.remult.dev/in-depth/4-filtering/1-custom-filters) on Filtering relations, where you can explore and practice directly in the browser. This guided experience offers step-by-step lessons to help you master filtering in Remult with practical examples and exercises.
[Click here to dive into the interactive tutorial on Filtering and Relations!](https://learn.remult.dev/in-depth/4-filtering/1-custom-filters)
:::
In this article, we'll discuss several techniques that are relevant to one-to-many relations.
Consider a scenario where we have a `Customer` entity and an `Order` entity.
We'll use the following entity definitions throughout this article.
```ts
import { Entity, Field, Fields, remult, Relations } from 'remult'
@Entity('customers')
export class Customer {
@Fields.autoIncrement()
id = 0
@Fields.string()
name = ''
@Fields.string()
city = ''
@Relations.toMany(() => Order)
orders?: Order[]
}
@Entity('orders')
export class Order {
@Fields.autoIncrement()
id = 0
@Relations.toOne(() => Customer)
customer!: Customer
@Fields.number()
amount = 0
}
```
::: tip Use Case in this article
Let's say that we want to filter all the orders of customers who are in London.
Let's have a look at the different options to achieve this.
:::
## Option 1 - Use In Statement
Pass the matching customers inline in the `where` of the `find` method; the array of customers acts as an "in" filter.
```ts
console.table(
await repo(Order).find({
where: {
customer: await repo(Customer).find({
where: {
city: 'London',
},
}),
},
}),
)
```
## Option 2 - Use Custom Filter
We can refactor this into a custom filter that is easier to use and runs on the backend:
```ts
import { Filter } from 'remult'
@Entity('orders', { allowApiCrud: true })
export class Order {
//...
static filterCity = Filter.createCustom(
async ({ city }) => ({
customer: await repo(Customer).find({ where: { city } }),
}),
)
}
```
And then we can use it:
```ts
console.table(
await repo(Order).find({
where: Order.filterCity({
city: 'London',
}),
}),
)
```
## Option 3 - Custom Filter (SQL)
We can improve on the custom filter by using the database's in statement capabilities:
```ts
import { SqlDatabase } from 'remult'
@Entity('orders', { allowApiCrud: true })
export class Order {
//...
static filterCity = Filter.createCustom(
async ({ city }) =>
SqlDatabase.rawFilter(
({ param }) =>
`customer in (select id from customers where city = ${param(city)})`,
),
)
}
```
We can also reuse the entity definitions by using `dbNamesOf` and `filterToRaw`:
```ts
import { dbNamesOf } from 'remult'
@Entity('orders', { allowApiCrud: true })
export class Order {
//...
static filterCity = Filter.createCustom(
async ({ city }) => {
const orders = await dbNamesOf(Order)
const customers = await dbNamesOf(Customer)
return SqlDatabase.rawFilter(
async ({ filterToRaw }) =>
`${orders.customer} in
(select ${customers.id}
from ${customers}
where ${await filterToRaw(Customer, { city })})`,
)
},
)
}
```
## Option 4 - sqlExpression field
```ts
@Entity('orders', { allowApiCrud: true })
export class Order {
//...
@Fields.string({
sqlExpression: async () => {
const customer = await dbNamesOf(Customer)
const order = await dbNamesOf(Order)
return `(
select ${customer.city}
from ${customer}
where ${customer.id} = ${order.customer}
)`
},
})
city = ''
}
```
- This adds a calculated `city` field to the `Order` entity that we can use to order by or filter
```ts
console.table(
await repo(Order).find({
where: {
city: 'London',
},
}),
)
```
::: details Side Note
In this option, `city` is always calculated, and the `sqlExpression` is always executed. Not a big deal, but it's worth mentioning. (Check out Option 5 for a solution.)
:::
## Option 5 - Dedicated entity
```ts
export class OrderWithCity extends Order {
@Fields.string({
sqlExpression: async () => {
const customer = await dbNamesOf(Customer)
const order = await dbNamesOf(Order)
return `(
select ${customer.city}
from ${customer}
where ${customer.id} = ${order.customer}
)`
},
})
city = ''
}
```
Like this, in your code, you can use `OrderWithCity` or `Order` depending on your needs.
::: tip
As `OrderWithCity` extends `Order`, everything in `Order` is also available in `OrderWithCity` 🎉.
:::
# Entities - Lifecycle Hooks
# Entity Lifecycle Hooks
In Remult, you can take advantage of Entity Lifecycle Hooks to add custom logic and actions at specific stages of an entity's lifecycle. There are five lifecycle events available: `validation`, `saving`, `saved`, `deleting`, and `deleted`. These hooks allow you to perform actions or validations when specific events occur in the entity's lifecycle.
## Validation
- **Runs On**: Backend and Frontend.
- **Purpose**: To perform validations on the entity's data before saving.
- **Example**:
```ts
@Entity("tasks", {
validation: async (task, e) => {
if (task.title.length < 5) {
throw new Error("Task title must be at least 5 characters long.");
}
},
})
```
You can run custom validation as in this example, and you can also use [built-in validation](./validation.md).
## Saving
- **Runs On**: Backend (or Frontend if using a local frontend database).
- **Purpose**: To execute custom logic before an entity is saved.
- **Example**:
```ts
@Entity("tasks", {
saving: async (task, e) => {
if (e.isNew) {
task.createdAt = new Date(); // Set the creation date for new tasks.
}
task.lastUpdated = new Date(); // Update the last updated date.
},
})
```
## Saved
- **Runs On**: Backend (or Frontend if using a local frontend database).
- **Purpose**: To perform actions after an entity has been successfully saved.
- **Example**: Useful for triggering additional processes or updates after saving.
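For instance, a minimal sketch of a `saved` hook that logs once the row has been persisted:
```ts
@Entity("tasks", {
  saved: async (task, e) => {
    // Runs after the row has been persisted - a good place to trigger follow-up processes.
    console.log(`Task "${task.title}" was ${e.isNew ? "created" : "updated"}.`);
  },
})
```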
## Deleting
- **Runs On**: Backend (or Frontend if using a local frontend database).
- **Purpose**: To execute custom logic before an entity is deleted.
- **Example**: You can use this to ensure related data is properly cleaned up or archived.
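For instance, a minimal sketch that removes related rows before the delete; `TaskComment` and its `taskId` field are hypothetical and stand in for whatever related entity your model defines:
```ts
@Entity("tasks", {
  deleting: async (task) => {
    // Clean up related rows before the task itself is removed.
    // (TaskComment and taskId are illustrative placeholders.)
    await remult.repo(TaskComment).deleteMany({ where: { taskId: task.id } });
  },
})
```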
## Deleted
- **Runs On**: Backend (or Frontend if using a local frontend database).
- **Purpose**: To perform actions after an entity has been successfully deleted.
- **Example**: Similar to the `saved` event, this is useful for any post-deletion processes.
## Field Saving Hook
Additionally, you can define a field-specific `saving` hook that allows you to perform custom logic on a specific field before the entity `saving` hook. It has the following signature:
```ts
@Fields.date({
saving: (task, fieldRef, e) => {
if (e.isNew) task.createdAt = new Date()
},
})
createdAt = new Date()
```
Or, using the `fieldRef`:
```ts
@Fields.date({
saving: (_, fieldRef, e) => {
if (e.isNew) fieldRef.value = new Date()
},
})
createdAt = new Date()
```
You can use the field `saving` hook to perform specialized actions on individual fields during the entity's saving process.
## Lifecycle Event Args
Each lifecycle event receives an instance of the relevant entity and an event args of type `LifecycleEvent`. The `LifecycleEvent` object provides various fields and methods to interact with the entity and its context. Here are the fields available in the `LifecycleEvent`:
- `isNew`: A boolean indicating whether the entity is new (being created).
- `fields`: A reference to the entity's fields, allowing you to access and modify field values.
- `id`: The ID of the entity.
- `originalId`: The original ID of the entity, which may differ during certain operations.
- `repository`: The repository associated with the entity.
- `metadata`: The metadata of the entity, providing information about its structure.
- `preventDefault()`: A method to prevent the default behavior associated with the event.
- `relations`: Access to repository relations for the entity, allowing you to work with related data.
## Example Usage
Here's an example of how to use Entity Lifecycle Hooks to add custom logic to the `saving` event:
```ts
@Entity("tasks", {
saving: async (task, e) => {
if (e.isNew) {
task.createdAt = new Date(); // Set the creation date for new tasks.
}
task.lastUpdated = new Date(); // Update the last updated date.
},
})
```
In this example, we've defined a `saving` event for the `Task` entity. When a task is being saved, the event handler is called. If the task is new (not yet saved), we set its `createdAt` field to the current date. In either case, we update the `lastUpdated` field with the current date.
Entity Lifecycle Hooks provide a powerful way to customize the behavior of your entities and ensure that specific actions or validations are performed at the right time in the entity's lifecycle. You can use these hooks to streamline your application's data management and enforce business rules.
# Entities - Migrations
# Migrations
Managing database schemas is crucial in web development. Traditional migration approaches introduce complexity and risks. Remult, designed for data-driven web apps with TypeScript, offers a simpler method.
## You Don't Necessarily Need Migrations
Migration files are standard but can complicate database schema management. They're prone to errors, potentially leading to conflicts or downtime. Remult proposes a streamlined alternative: automatic schema synchronization. This approach simplifies schema management by ensuring your database schema aligns with your application code without the manual overhead of traditional migrations.
### Embracing Schema Synchronization with Remult
Remult offers an alternative: automatic schema synchronization. **By default, Remult checks for and synchronizes your database schema with the entity types** provided in the `RemultServerOptions.entities` property when the server loads. This feature automatically adds any missing tables or columns, significantly simplifying schema management.
::: tip No Data Loss with Remult's Safe Schema Updates
**Remult's schema synchronization** ensures **safe and automatic updates** to your database schema. By only adding new tables or columns without altering existing ones, Remult prevents data loss. This design offers a secure way to evolve your application's database schema.
:::
#### Disabling Automatic Schema Synchronization
For manual control, Remult allows disabling automatic schema synchronization:
```typescript
const api = remultExpress({
entities: [], // Your entities here
ensureSchema: false, // Disables automatic schema synchronization, Default: true
})
```
#### Manually Triggering Schema Synchronization
In certain scenarios, you might want to manually trigger the `ensureSchema` function to ensure that your database schema is up-to-date with your entity definitions. Here's how you can do it:
```ts
remult.dataProvider.ensureSchema!(entities.map((x) => remult.repo(x).metadata))
```
## Quick Start: Introducing Migrations to Your Application
Introducing migrations to your Remult application involves a few straightforward steps. The goal is to ensure that your migrations and API share the same data provider and entity definitions. Here's how you can do it:
### 1. Refactor Your Configuration
Start by refactoring the `dataProvider` and `entities` definitions from the `api.ts` file to a new file named `src/server/config.ts`. This allows you to use the same configurations for both your API and migrations.
In your `src/server/config.ts` file, define your entities and data provider as follows:
```ts
import { createPostgresDataProvider } from 'remult/postgres'
import { Task } from '../shared/task'
export const entities = [Task /* ...other entities */]
export const dataProvider = createPostgresDataProvider({
connectionString: 'your connection string',
})
```
:::tip Using environment variables
In most cases, the connection string for your database will not be hard-coded but stored in an environment variable for security and flexibility. A common practice is to use a `.env` file to store environment variables in development and load them using the `dotenv` npm package. Here's how you can set it up:
1. Install the `dotenv` package:
```sh
npm install dotenv
```
2. Create a `.env` file in the root of your project and add your database connection string:
```
DATABASE_URL=your_connection_string
```
3. At the beginning of your `src/server/config.ts` file, load the environment variables:
```ts
import { config } from 'dotenv'
config()
```
4. Access the connection string using `process.env`:
```ts
export const dataProvider = createPostgresDataProvider({
connectionString: process.env['DATABASE_URL'],
})
```
By following these steps, you ensure that your application securely and flexibly manages the database connection string.
:::
### 2. Adjust the API Configuration
Next, adjust your `api.ts` file to use the configurations from the `config.ts` file, and disable the `ensureSchema` migrations:
```ts
import { remultExpress } from 'remult/remult-express'
import { dataProvider, entities } from './config'
export const api = remultExpress({
entities,
dataProvider,
ensureSchema: false,
})
```
### 3. Generate the migration
::: tip Prettier
The migration generator uses `prettier` to format the generated code for better readability and consistency. If you don't already have `prettier` installed in your project, we recommend installing it as a development dependency using the following command:
```sh
npm i -D prettier
```
:::
To enable automatic generation of migration scripts, follow these steps:
1. **Create the Migrations Folder:** In your `src/server` directory, create a new folder named `migrations`. This folder will hold all your migration scripts.
2. **Create the Migration Generator File:** Inside the `migrations` folder, create a file named `generate-migrations.ts`. This file will contain the script that generates migration scripts based on changes in your entities.
3. **Populate the Generator File:** Add the following code to `generate-migrations.ts`:
```ts
import { generateMigrations } from 'remult/migrations'
import { dataProvider, entities } from '../config'
generateMigrations({
dataProvider, // The data provider for your database
entities, // Entity classes to include in the migration
endConnection: true, // Close the database connection after generating migrations (useful for standalone scripts)
})
```
This script generates migration scripts based on changes in your entities. If you're calling this method on a server where the database connection should remain open, omit the `endConnection` parameter or set it to `false`.
4. **Generate Migrations:** To generate the migration scripts, run the `generate-migrations.ts` script using the following command:
```sh
npx tsx src/server/migrations/generate-migrations.ts
```
This command will create two important files:
1. **`migrations-snapshot.json`**: This file stores the last known state of your entities. It helps the migration generator understand what changes have been made since the last migration was generated.
2. **`migrations.ts`**: This file contains the actual migration scripts that need to be run to update your database schema. The structure of this file is as follows:
```ts
import type { Migrations } from 'remult/migrations'
export const migrations: Migrations = {
0: async ({ sql }) => {
await sql(`--sql
CREATE SCHEMA IF NOT EXISTS public;
CREATE TABLE "tasks" (
"id" VARCHAR DEFAULT '' NOT NULL PRIMARY KEY,
"title" VARCHAR DEFAULT '' NOT NULL
)`)
},
}
```
Each migration script is associated with a unique identifier (in this case, `0`) and contains the SQL commands necessary to update the database schema.
By running this script whenever you make changes to your entities, you can automatically generate the necessary migration scripts to keep your database schema in sync with your application's data model.
It's important to note that each migration can include any code that the developer wishes to include, not just SQL statements. The `sql` parameter is provided to facilitate running SQL commands, but you can also include other logic or code as needed. Additionally, developers are encouraged to add their own custom migrations to address specific requirements or changes that may not be covered by automatically generated migrations. This flexibility allows for a more tailored approach to managing database schema changes.
### 4. Run the Migrations
To apply the migrations to your database, you'll need to create a script that executes them.
#### Setting Up the Migration Script
1. **Create the Migration Script:** In the `src/server/migrations` folder, add a file named `migrate.ts`.
2. **Populate the Script:** Add the following code to `migrate.ts`:
```ts
import { migrate } from 'remult/migrations'
import { dataProvider } from '../config'
import { migrations } from './migrations'
migrate({
dataProvider,
migrations,
endConnection: true, // Close the database connection after applying migrations
})
```
This script sets up the migration process. The `migrate` function checks the last migration executed on the database and runs all subsequent migrations based on their index in the `migrations` file. The entire call to `migrate` is executed in a transaction, ensuring that either all required migration steps are executed or none at all, maintaining the integrity of your database schema.
::: warning Warning: Database Transaction Support for Structural Changes
It's important to note that some databases, like MySQL, do not support rolling back structural changes as part of a transaction. This means that if you make changes to the database schema (such as adding or dropping tables or columns) and something goes wrong, those changes might not be automatically rolled back. Developers need to be aware of this limitation and plan their migrations accordingly to avoid leaving the database in an inconsistent state.
Always consult your database's documentation to understand the specifics of transaction support and plan your migrations accordingly.
:::
3. **Execute the Script:** Run the migration script using the following command:
```sh
npx tsx src/server/migrations/migrate.ts
```
## Integrating Migrations into Your Deployment Process
You have a couple of options for when and how to run your migrations:
- **As Part of the Build Step:** You can include the migration script as part of your build or deployment process. This way, if the migration fails, the deployment will also fail, preventing potential issues with an inconsistent database state.
- **During Application Initialization:** Alternatively, you can run the migrations when your application loads by using the `initApi` option in your `api.ts` file:
```ts
// src/server/api.ts
import { remultExpress } from 'remult/remult-express'
import { dataProvider, entities } from './config'
import { migrate } from 'remult/migrations/migrate'
import { migrations } from './migrations/migrations'
import { remult } from 'remult'
export const api = remultExpress({
entities,
dataProvider,
initApi: async () => {
await migrate({
dataProvider: remult.dataProvider,
migrations,
endConnection: false, //it's the default :)
})
},
})
```
This approach ensures that the migrations are applied each time the API initializes. Note that the `migrate` and `generateMigrations` functions typically close the connection used by the `dataProvider` when they complete. In this code, we disable this behavior using the `endConnection: false` option, instructing the `migrate` function to keep the `dataProvider` connection open when it completes.
Choose the approach that best fits your application's deployment and initialization process.
### Migration Philosophy: Embracing Backward Compatibility
We believe in designing migrations with a backward compatibility mindset. This approach ensures that older versions of the code can operate smoothly with newer versions of the database. To achieve this, we recommend:
- Never dropping columns or tables.
- Instead of altering a column, adding a new column and copying the data to it as part of the migration process (see the sketch below).
This philosophy minimizes disruptions and ensures a smoother transition during database schema updates.
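For example, a hand-written migration that follows this philosophy might look like the sketch below; the table and column names are illustrative, and the entry would be appended to your generated `migrations.ts`:

```ts
import type { Migrations } from 'remult/migrations'
export const migrations: Migrations = {
  // ...previously generated migrations
  1: async ({ sql }) => {
    // Add a new column instead of altering the existing one...
    await sql(`ALTER TABLE "tasks" ADD COLUMN "title_v2" VARCHAR DEFAULT '' NOT NULL`)
    // ...and copy the existing data into it, keeping the old column intact.
    await sql(`UPDATE "tasks" SET "title_v2" = "title"`)
  },
}
```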
# Entities - Generate from Existing DB
# Generate Entities from Existing Database
## Remult kit
Want to use Remult for full-stack CRUD with your existing database?
Check out this video to see how to connect Remult to your existing database and start building type-safe full-stack apps with any TypeScript frontend and backend, and any database.
Watch now 👉 https://youtu.be/5QCzJEO-qQ0
# Entities - Offline Support
# Offline Support
In modern web applications, providing a seamless user experience often involves enabling offline functionality. This ensures that users can continue to interact with the application even without an active internet connection. Remult supports several offline databases that can be used to store data in the browser for offline scenarios, enhancing the application's resilience and usability.
## Using Local Database for Specific Calls
To utilize a local database for a specific call, you can pass the `dataProvider` as a second parameter to the `repo` function. This allows you to specify which database should be used for that particular operation.
```typescript
import { localDb } from './some-file.ts'
console.table(await repo(Task, localDb).find())
```
In this example, `localDb` is used as the data provider for the `Task` repository, enabling data fetching from the local database.
## JSON in LocalStorage / SessionStorage
For simple data storage needs, you can use JSON data providers that leverage the browser's `localStorage` or `sessionStorage`.
```typescript
import { JsonDataProvider, Remult } from 'remult'
export const remultLocalStorage = new Remult(new JsonDataProvider(localStorage))
```
This approach is straightforward and suitable for small datasets that need to persist across sessions or page reloads.
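For example, a minimal usage sketch, assuming a `Task` entity defined elsewhere in your app:

```typescript
// query tasks stored as JSON in the browser's localStorage
const tasks = await remultLocalStorage.repo(Task).find()
```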
## JSON Storage in IndexedDB
For more complex offline storage needs, such as larger datasets and structured queries, `IndexedDB` provides a robust solution. Using Remult’s `JsonEntityIndexedDbStorage`, you can store entities in `IndexedDB`, which is supported across all major browsers. This allows for efficient offline data management while offering support for larger volumes of data compared to `localStorage` or `sessionStorage`.
```typescript
import { JsonDataProvider, JsonEntityIndexedDbStorage } from 'remult'
// Initialize a JSON data provider backed by IndexedDB storage
const db = new JsonDataProvider(new JsonEntityIndexedDbStorage())
// Use the local IndexedDB to store and fetch tasks
console.table(await repo(Task, db).find())
```
In this example, `JsonEntityIndexedDbStorage` is used to persist the data to `IndexedDB`. This method is ideal for applications with large data sets or those requiring more complex interactions with the stored data in offline mode.
## JSON Storage in OPFS (Origin Private File System)
Origin Private File System (OPFS) is a modern browser feature supported by Chrome and Safari, allowing for more structured and efficient data storage in the frontend.
```typescript
import { JsonDataProvider, JsonEntityOpfsStorage } from 'remult'
const localDb = new JsonDataProvider(new JsonEntityOpfsStorage())
```
Using OPFS with Remult's `JsonDataProvider` provides a robust solution for storing entities in the frontend, especially for applications requiring more complex data handling than what `localStorage` or `sessionStorage` can offer.
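As with the other local providers, you can pass this `localDb` to `repo` for specific calls (a minimal sketch, assuming a `Task` entity):

```typescript
// fetch tasks from the OPFS-backed JSON storage
console.table(await repo(Task, localDb).find())
```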
## `sql.js`: A SQLite Implementation for the Frontend
For applications requiring advanced database functionality, [`sql.js`](https://sql.js.org/) provides a SQLite implementation that runs entirely in the frontend. This allows you to use SQL queries and transactions, offering a powerful and flexible data management solution for offline scenarios.
Before using `sql.js` in your project, you need to install the package and its TypeScript definitions. Run the following commands in your terminal:
```bash
npm install sql.js
npm install @types/sql.js --save-dev
```
After installing the necessary packages, you can use the following code sample in your project:
```typescript
import { SqlDatabase } from 'remult'
import { SqlJsDataProvider } from 'remult/remult-sql-js'
import initSqlJs, { type Database } from 'sql.js'
let sqlDb: Database
// Initialize the SqlJsDataProvider with a new database instance
const sqlJsDataProvider = new SqlJsDataProvider(
initSqlJs({
locateFile: (file) => `https://sql.js.org/dist/${file}`, // for complete offline support, change this to a url that is available offline
}).then((x) => {
// Load the database from localStorage if it exists
const dbData = localStorage.getItem('sqljs-db')
if (dbData) {
const buffer = new Uint8Array(JSON.parse(dbData))
return (sqlDb = new x.Database(buffer))
}
return (sqlDb = new x.Database())
}),
)
// Set up an afterMutation hook to save the database to localStorage after any mutation
sqlJsDataProvider.afterMutation = async () => {
const db = sqlDb
const buffer = db.export()
localStorage.setItem('sqljs-db', JSON.stringify([...buffer]))
}
const localDb = new SqlDatabase(sqlJsDataProvider)
```
This code sets up a SQLite database using `sql.js` in your Remult project, with support for saving to and loading from `localStorage`.
## Summary
Remult's support for various offline databases empowers developers to create web applications that provide a seamless user experience, even in offline scenarios. Whether using simple JSON storage in `localStorage` or more advanced solutions like OPFS or `sql.js`, Remult offers the flexibility to choose the right data storage solution for your application's needs. By leveraging these offline capabilities, you can ensure that your application remains functional and responsive, regardless of the user's connectivity status.
# Entities - Active Record & EntityBase
# Mutability and the Active Record Pattern
The Active Record pattern is a concept in software architecture, particularly useful when working with mutable objects whose state may change over time. This design pattern facilitates direct interaction with the database through the object representing a row of the data table. In this article, we'll delve into the fundamentals of the Active Record pattern, contrasting it with immutable patterns, and exploring its implementation and advantages in software development.
### Immutable vs. Mutable Patterns
In modern software development, handling data objects can generally be approached in two ways: immutable and mutable patterns.
**Immutable objects** do not change once they are created. Any modification on an immutable object results in a new object. For example, in the React framework, immutability is often preferred:
```typescript
// Immutable update
const updatePerson = { ...person, name: 'newName' }
```
However, libraries like MobX offer the flexibility to work with mutable objects while still providing the reactivity that React components need.
**Mutable objects**, on the other hand, allow for changes directly on the object itself:
```typescript
// Mutable update
person.name = 'newName'
```
Mutable patterns are especially prevalent in scenarios where the state of objects changes frequently, making them a staple in many programming environments outside of React.
### The Role of Active Record Pattern
The Active Record pattern embodies the concept of mutability by binding business logic to object data models. Typically, each model instance corresponds to a row in the database, with the class methods providing the functionality to create, read, update, and delete records.
### Warning: Mutable Objects in React
Using mutable objects with the Active Record pattern in React (without libraries like MobX) requires careful handling. React’s rendering cycle is built around the premise of immutability; it typically relies on immutable state management to trigger re-renders. When mutable objects change state outside the scope of React's `useState` or `useReducer`, React does not automatically know to re-render the affected components. This can lead to issues where the UI does not reflect the current application state.
These challenges can be mitigated by integrating state management tools that are designed to work well with mutable objects, such as MobX. MobX provides mechanisms to track changes in data and automatically re-render components when mutations occur. This aligns more naturally with the Active Record pattern within the context of React, ensuring that the UI stays in sync with the underlying data.
#### Using EntityBase and IdEntity
In practice, leveraging the Active Record pattern often involves inheriting from classes such as `EntityBase` or `IdEntity` (a variant of `EntityBase` with a UUID as the identifier). These base classes enrich models with methods that simplify manipulations of their attributes and their persistence in the database.
```typescript
@Entity('people')
export class Person extends IdEntity {
@Fields.string()
name = ''
}
```
**Explanation:**
The `Person` class represents individuals in the 'people' table and inherits from `IdEntity`. This inheritance means that there is no need to explicitly define an `id` field for this class, as `IdEntity` automatically includes a UUID field (`id`). Consequently, `Person` benefits from all the functionalities of `EntityBase`, which include tracking changes and handling CRUD operations, while also automatically gaining a UUID as the identifier.
### Mutable vs EntityBase (active-record)
**Traditional approach without Active Record:**
```typescript
// Updating a person's name the traditional way
await repo(Person).update(person, { name: 'newName' })
```
**Using Active Record with EntityBase:**
```typescript
// Active Record style
person.name = 'newName'
await person.save()
```
This pattern also simplifies other operations:
```typescript
// Deleting a record
await person.delete()
// Checking if the record is new
if (person.isNew()) {
// Perform a specific action
}
```
#### Helper Members in EntityBase
EntityBase provides additional utility members like `_` and `$` to facilitate more complex interactions:
- **`_` (EntityRef Object):** Allows performing operations on a specific instance of an entity.
```typescript
await person._.reload()
```
- **`$` (FieldsRef):** Provides access to detailed information about each field in the current instance, such as their original and current values:
```typescript
// Logging changes in a field
console.log(
`Name changed from "${person.$.name.originalValue}" to "${person.name}"`,
)
```
### Alternative Implementations
Even without direct inheritance from `EntityBase`, similar functionalities can be achieved using helper functions such as `getEntityRef`, which encapsulates an entity instance for manipulation and persistence:
```typescript
const ref = getEntityRef(person)
await ref.save()
```
### Conclusion
The Active Record pattern offers a straightforward and intuitive approach to interacting with database records through object-oriented models. It is particularly beneficial in environments where business logic needs to be tightly coupled with data manipulation, providing a clear and efficient way to handle data state changes. However, integrating the Active Record pattern with mutable objects in React can be challenging.
# Active Record & EntityBase - Entity Backend Methods
# Entity Instance Backend Methods
When leveraging the Active Record pattern, backend methods for entity instances offer a powerful way to integrate client-side behavior with server-side logic. These methods, when invoked, transport the entire entity's state from the client to the server and vice versa, even if the data has not yet been saved. This feature is particularly useful for executing entity-specific operations that require a round-trip to the server to maintain consistency and integrity.
## Overview of Entity Backend Methods
Entity backend methods enable all the fields of an entity, including unsaved values, to be sent to and from the server during the method's execution. This approach is essential for operations that rely on the most current state of an entity, whether or not the changes have been persisted to the database.
### Defining a Backend Method
To define a backend method, use the `@BackendMethod` decorator to annotate methods within an entity class. This decorator ensures that the method is executed on the server, taking advantage of server-side resources and permissions.
Here is an example demonstrating how to define and use a backend method in an entity class:
```typescript
@Entity('tasks', {
allowApiCrud: true,
})
export class Task extends IdEntity {
@Fields.string()
title = ''
@Fields.boolean()
completed = false
@BackendMethod({ allowed: true })
async toggleCompleted() {
this.completed = !this.completed
console.log({
title: this.title,
titleOriginalValue: this.$.title.originalValue,
})
await this.save()
}
}
```
### Calling the Backend Method from the Frontend
Once the backend method is defined, it can be called from the client-side code. This process typically involves fetching an entity instance and then invoking the backend method as shown below:
```typescript
const task = await remult.repo(Task).findFirst()
await task.toggleCompleted()
```
### Security Considerations
::: danger
It's important to note that backend methods bypass certain API restrictions that might be set on the entity, such as `allowApiUpdate=false`. This means that even if an entity is configured not to allow updates through standard API operations, it can still be modified through backend methods if they are permitted by their `allowed` setting. Consequently, developers must explicitly handle security and validation within these methods to prevent unauthorized actions.
The principle here is that if a user has permission to execute the `BackendMethod`, then all operations within that method are considered authorized. It is up to the developer to implement any necessary restrictions within the method itself.
:::
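For example, building on the `Task` entity above, you can enforce additional rules inside the method itself. This is only a sketch; the `'admin'` role check is illustrative:

```typescript
@BackendMethod({ allowed: Allow.authenticated })
async toggleCompleted() {
  // API-level restrictions such as allowApiUpdate are not applied here,
  // so explicitly verify what this caller is allowed to do.
  if (this.completed && !remult.isAllowed('admin'))
    throw new Error('Only admins can reopen a completed task')
  this.completed = !this.completed
  await this.save()
}
```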
# Active Record & EntityBase - Mutable Controllers
# Introduction to Mutable Controllers and Backend Methods
In web development architectures, mutable controllers offer a convenient way to manage state and facilitate interactions between the client (frontend) and the server (backend). These controllers are useful in scenarios where state needs to be maintained and manipulated across server calls, providing a streamlined approach to handling data.
## Overview of Controller Backend Methods
A Controller is a class designed to encapsulate business logic and data processing. When a backend method in a controller is called, it ensures that all field values are preserved and appropriately transferred between the frontend and backend, maintaining state throughout the process.
### Defining a Mutable Controller
The mutable controller is typically defined in a shared module, allowing both the frontend and backend to interact with it efficiently. Below is an example of how to define such a controller and a backend method within it.
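The exact fields and sign-in logic depend on your application; the sketch below is a minimal, hypothetical definition inferred from the usage further down, with placeholder authentication logic:

```typescript
// shared/UserSignInController.ts
import { BackendMethod, Controller, Fields } from 'remult'
@Controller('UserSignInController')
export class UserSignInController {
  @Fields.string()
  email = ''
  @Fields.string()
  password = ''
  @Fields.boolean()
  rememberMe = false
  @BackendMethod({ allowed: true })
  async signInUser() {
    // Placeholder logic - replace with your real credential check and session handling.
    if (this.email === '' || this.password === '')
      throw new Error('Email and password are required')
    // ...verify credentials, create a session, honor `rememberMe`, etc.
    return { email: this.email }
  }
}
```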
### Explanation with Data Flow and Example Usage
This example demonstrates the use of a mutable controller, `UserSignInController`, to handle the sign-in process for users in a web application. Let's break down the key components of this example:
1. **Controller Definition**: The `UserSignInController` is a class annotated with `@Controller('UserSignInController')`, indicating that it serves as a controller for handling user sign-in operations.
2. **Data Flow**: When the `signInUser` backend method is called from the frontend, all the values of the controller fields (`email`, `password`, `rememberMe`) will be sent to the backend for processing. Once the method completes its execution, the updated values (if any) will be sent back to the frontend.
### Example Usage
Here's how you can use the `UserSignInController` on the frontend to initiate the sign-in process:
```typescript
const signInController = new UserSignInController()
signInController.email = 'user@example.com'
signInController.password = 'password123'
signInController.rememberMe = true // Optional: Set to true if the user wants to remain logged in
try {
const user = await signInController.signInUser()
console.log(`User signed in: ${user.email}`)
} catch (error) {
console.error('Sign-in failed:', error.message)
}
```
In this example, we create an instance of `UserSignInController` and set its `email`, `password`, and `rememberMe` fields with the appropriate values. We then call the `signInUser` method to initiate the sign-in process. If successful, we log a message indicating that the user has signed in. If an error occurs during the sign-in process, we catch the error and log a corresponding error message.
This usage demonstrates how to interact with the mutable controller to handle user sign-in operations seamlessly within a web application.
### Summary
Mutable controllers and backend methods provide a powerful mechanism for managing state and handling user interactions in web applications. By encapsulating business logic and data processing within controllers, developers can ensure consistent behavior and efficient data flow between the frontend and backend. With the ability to preserve and transfer field values during server calls, mutable controllers facilitate a smooth and responsive user experience, enhancing the overall functionality and performance of web applications.
# Stacks
# Stacks
## Frameworks
## Servers
## Databases
# Stacks - Framework
# Select a framework
# Framework - React
# React
## Create a React Project with Vite
To set up a new React project using Vite, run the following commands:
```sh
npm create vite@latest remult-react-project -- --template react-ts
cd remult-react-project
```
## Install Remult
Install the latest version of Remult:
```bash
npm install remult@latest
```
## Enable TypeScript Decorators in Vite
To enable the use of decorators in your React app, modify the `vite.config.ts` file by adding the following to the `defineConfig` section:
```ts{6-12}
// vite.config.ts
// ...
export default defineConfig({
plugins: [react()],
esbuild: {
tsconfigRaw: {
compilerOptions: {
experimentalDecorators: true,
},
},
},
});
```
This configuration ensures that TypeScript decorators are enabled for the project.
## Proxy API Requests from Vite DevServer to the API Server
In development, your React app will be served from `http://localhost:5173`, while the API server will run on `http://localhost:3002`. To allow the React app to communicate with the API server during development, use Vite's [proxy](https://vitejs.dev/config/#server-proxy) feature.
Add the following proxy configuration to the `vite.config.ts` file:
```ts{6}
// vite.config.ts
//...
export default defineConfig({
plugins: [react()],
server: { proxy: { "/api": "http://localhost:3002" } },
esbuild: {
tsconfigRaw: {
compilerOptions: {
experimentalDecorators: true,
},
},
},
});
```
This setup proxies all requests starting with `/api` from `http://localhost:5173` to your API server running at `http://localhost:3002`.
## Configure a Server
Now that the app is set up, [Select an API Server](../server/)
# Framework - Angular
# Angular
## Create an Angular Project
To set up a new Angular project, use the Angular CLI:
```sh
ng new remult-angular
cd remult-angular
```
## Install Remult
Install the latest version of Remult in your Angular project:
```bash
npm install remult@latest
```
## Proxy API Requests from Angular DevServer to the API Server
In development, your Angular app will be served from `http://localhost:4200`, while the API server will run on `http://localhost:3002`. To allow the Angular app to communicate with the API server during development, you can use Angular's [proxy](https://angular.io/guide/build#proxying-to-a-backend-server) feature.
1. Create a file named `proxy.conf.json` in the root folder of your project with the following content:
```json
// proxy.conf.json
{
"/api": {
"target": "http://localhost:3002",
"secure": false
}
}
```
This configuration redirects all API calls from the Angular dev server to the API server running at `http://localhost:3002`.
## Adjust the `package.json`
Modify the `package.json` to use the newly created proxy configuration when serving the Angular app:
```json
// package.json
"dev": "ng serve --proxy-config proxy.conf.json --open",
```
Running the `dev` script will start the Angular dev server with the proxy configuration enabled.
## Configure a Server
Now that the app is set up, [Select an API Server](../server/)
# Framework - Vue
# Vue
## Create a Vue Project with Vite
To set up a new Vue project using Vite, run the following commands:
```sh
npm create vue@latest remult-vue-project
cd remult-vue-project
```
## Install Remult
Install the latest version of Remult:
```bash
npm install remult@latest
```
## Enable TypeScript Decorators in Vite
To enable the use of decorators in your Vue app, modify the `vite.config.ts` file by adding the following to the `defineConfig` section:
```ts{6-12}
// vite.config.ts
// ...
export default defineConfig({
plugins: [vue()],
esbuild: {
tsconfigRaw: {
compilerOptions: {
experimentalDecorators: true,
},
},
},
});
```
This configuration ensures that TypeScript decorators are enabled for the project.
## Proxy API Requests from Vite DevServer to the API Server
In development, your Vue app will be served from `http://localhost:5173`, while the API server will run on `http://localhost:3002`. To allow the Vue app to communicate with the API server during development, use Vite's [proxy](https://vitejs.dev/config/#server-proxy) feature.
Add the following proxy configuration to the `vite.config.ts` file:
```ts{6}
// vite.config.ts
//...
export default defineConfig({
plugins: [vue()],
server: { proxy: { "/api": "http://localhost:3002" } },
esbuild: {
tsconfigRaw: {
compilerOptions: {
experimentalDecorators: true,
},
},
},
});
```
This setup proxies all requests starting with `/api` from `http://localhost:5173` to your API server running at `http://localhost:3002`.
## Configure a Server
Now that the app is set up, [Select an API Server](../server/)
# Framework - Next.js
# Next.js
## Create a Next.js Project
To create a new Next.js project, run the following command:
```sh
npx -y create-next-app@latest remult-nextjs
```
When prompted, use these answers:
```sh
✔ Would you like to use TypeScript? ... Yes
✔ Would you like to use ESLint? ... No
✔ Would you like to use Tailwind CSS? ... No
✔ Would you like to use `src/` directory? ... Yes
✔ Would you like to use App Router? (recommended) ... Yes
✔ Would you like to customize the default import alias? ... No
```
Afterward, navigate into the newly created project folder:
```sh
cd remult-nextjs
```
## Install Remult
Install the latest version of Remult:
```bash
npm install remult
```
## Bootstrap Remult in the Backend
Remult is bootstrapped in a Next.js app by creating a [catch-all dynamic API route](https://nextjs.org/docs/app/building-your-application/routing/dynamic-routes#catch-all-segments). This route will pass API requests to an object created using the `remultNextApp` function.
1. **Create an API file**
In the `src/` directory, create a file called `api.ts` with the following code to set up Remult:
```ts
// src/api.ts
import { remultNextApp } from 'remult/remult-next'
export const api = remultNextApp({})
```
2. **Create the API Route**
In the `src/app/api` directory, create a `[...remult]` subdirectory. Inside that directory, create a `route.ts` file with the following code:
```ts
// src/app/api/[...remult]/route.ts
import { api } from '../../../api'
export const { POST, PUT, DELETE, GET } = api
```
This file serves as a catch-all route for the Next.js API, handling all API requests by routing them through Remult.
## Enable TypeScript Decorators
To enable the use of decorators in your Next.js app, modify the `tsconfig.json` file. Add the following entry under the `compilerOptions` section:
```json{7}
// tsconfig.json
{
...
"compilerOptions": {
...
"experimentalDecorators": true // add this line
...
}
}
```
## Run the App
To start the development server, open a terminal and run the following command:
```sh
npm run dev
```
Your Next.js app is now running with Remult integrated and listening for API requests.
# Framework - Sveltekit
# SvelteKit
## Create a SvelteKit Project
To create a new SvelteKit project, run the following command:
```sh
npx sv@latest create remult-sveltekit-todo
```
During the setup, answer the prompts as follows:
1. **Which Svelte app template?**: ... `minimal` Project
2. **Add type checking with TypeScript?** ... Yes, using `TypeScript` syntax
3. **Select additional options**: ... We didn't select any for this tutorial; feel free to adapt this to your needs.
4. **Which package manager?**: ... We used `npm`; feel free to use another if you prefer.
Once the setup is complete, navigate into the project directory:
```sh
cd remult-sveltekit-todo
```
## Install Required Packages and Remult
Install Remult and any necessary dependencies by running:
```sh
npm install remult --save-dev
```
## Bootstrap Remult
To set up Remult in your SvelteKit project:
1. Create your remult `api`
::: code-group
```ts [src/server/api.ts]
import { remultSveltekit } from 'remult/remult-sveltekit'
export const api = remultSveltekit({})
```
:::
2. Create a remult `api route`
::: code-group
```ts [src/routes/api/[...remult]/+server.ts]
import { api } from '../../../server/api'
export const { GET, POST, PUT, DELETE } = api
```
:::
## Final Tweaks
Remult uses TypeScript decorators to enhance classes into entities. To enable decorators in your SvelteKit project, modify the `tsconfig.json` file by adding the following to the `compilerOptions` section:
```json [tsconfig.json]
{
"compilerOptions": {
"experimentalDecorators": true // [!code ++]
}
}
```
## Run the App
To start the development server, run the following command:
```sh
npm run dev
```
Your SvelteKit app will be available at [http://localhost:5173](http://localhost:5173).
Your SvelteKit project with Remult is now up and running.
# Extra
## Extra - Remult in other SvelteKit routes
To enable Remult across all SvelteKit routes, register the Remult handler in `src/hooks.server.ts`:
::: code-group
```ts [src/hooks.server.ts]
import { sequence } from '@sveltejs/kit/hooks'
import { api as handleRemult } from './server/api'
export const handle = sequence(
// Manage your sequence of handlers here
handleRemult,
)
```
:::
## Extra - Universal load & SSR
To use Remult in a universal `PageLoad` during SSR, leverage the `event`'s fetch. Data is loaded on the server without being reloaded on the frontend, and all API rules still apply even when the code runs on the server.
::: code-group
```ts [src/routes/+page.ts]
import { remult, repo } from 'remult'
import { Task } from '../demo/todo/Task'
import type { PageLoad } from './$types'
export const load = (async (event) => {
  // Instruct remult to use the special SvelteKit fetch.
  // This way the universal load works in both SSR & CSR.
  remult.useFetch(event.fetch)
  return { tasks: await repo(Task).find() }
}) satisfies PageLoad
```
:::
::: tip
You can add this in `+layout.ts` as well, and all routes **under** it will have the correct fetch out of the box.
:::
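For example, a minimal sketch of such a `+layout.ts` (the empty return value is a placeholder):
::: code-group
```ts [src/routes/+layout.ts]
// A sketch, mirroring the +page.ts example above
import { remult } from 'remult'
import type { LayoutLoad } from './$types'

export const load = (async (event) => {
  // All routes under this layout will use SvelteKit's fetch in SSR & CSR
  remult.useFetch(event.fetch)
  return {}
}) satisfies LayoutLoad
```
:::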
## Extra - Server load
If you return a remult entity from the `load` function of a `+page.server.ts`,
SvelteKit will complain and show this error:
```bash
Error: Data returned from `load` while rendering / is not serializable:
Cannot stringify arbitrary non-POJOs (data.tasks[0])
```
To fix this, you can use `repo(Entity).toJson()` in the server load function and `repo(Entity).fromJson()` in the `.svelte` file
to properly serialize and deserialize the entity.
::: code-group
```ts [src/routes/+page.server.ts]
import { repo } from 'remult'
import type { PageServerLoad } from './$types'
import { Task } from '../demo/todo/Task'
export const load = (async () => {
const tasks = repo(Task).toJson(await repo(Task).find())
return {
tasks,
}
}) satisfies PageServerLoad
```
```svelte [src/routes/+page.svelte]
```
:::
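On the client side, the matching deserialization could look roughly like this (a sketch of the `<script lang="ts">` block of `+page.svelte`, assuming the `tasks` property returned by the load function above):
```ts
// script block of src/routes/+page.svelte (sketch)
import { repo } from 'remult'
import { Task } from '../demo/todo/Task'
import type { PageData } from './$types'

export let data: PageData
// Rehydrate the plain JSON from the server load back into Task instances
const tasks = repo(Task).fromJson(data.tasks)
```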
---
#### Since `@sveltejs/kit@2.11.0`, there is a new feature: [Universal-hooks-transport](https://svelte.dev/docs/kit/hooks#Universal-hooks-transport)
With this feature, you can get rid of the explicit `repo(Entity).toJson()` and `repo(Entity).fromJson()` calls in your load functions by adding a `src/hooks.ts` file:
::: code-group
```ts [src/hooks.ts]
import { repo, type ClassType } from 'remult'
import { Task } from './demo/todo/Task'
import type { Transport } from '@sveltejs/kit'
// You can either:
// A) keep a local entity array handling only these entities (as done here), or
// B) import a shared entity array used by both backend and frontend
//    (defined outside ./server/api.ts)
const entities = [Task]
export const transport: Transport = {
remultTransport: {
encode: (value: any) => {
for (let index = 0; index < entities.length; index++) {
const element = entities[index] as ClassType<any>
if (value instanceof element) {
return {
...repo(element).toJson(value),
entity_key: repo(element).metadata.key,
}
}
}
},
decode: (value: any) => {
for (let index = 0; index < entities.length; index++) {
const element = entities[index] as ClassType<any>
if (value.entity_key === repo(element).metadata.key) {
return repo(element).fromJson(value)
}
}
},
},
}
```
```ts [src/routes/+page.server.ts]
import { repo } from 'remult'
import type { PageServerLoad } from './$types'
import { Task } from '../demo/todo/Task'
export const load = (async () => {
// const tasks = repo(Task).toJson(await repo(Task).find()) // [!code --]
const tasks = await repo(Task).find()
return {
tasks,
}
}) satisfies PageServerLoad
```
```svelte [src/routes/+page.svelte]
```
:::
## Extra - Svelte 5 & Reactivity
Remult is fully compatible with Svelte 5, Runes, and Svelte's reactivity.
To take full advantage of it, add this snippet:
::: code-group
```html [src/routes/+layout.svelte]
```
:::
Then you can use `$state` and `$derived` like anywhere else:
::: code-group
```html [src/routes/+page.svelte]
```
:::
# Framework - Nuxt
# Nuxt
### Create a Nuxt Project
To create a new Nuxt project, run the following command:
```sh
npx nuxi init remult-nuxt-todo
cd remult-nuxt-todo
```
### Install Remult
Install Remult in your Nuxt project by running the following command:
```sh
npm install remult
```
### Enable TypeScript Decorators
To enable the use of TypeScript decorators in your Nuxt project, modify the `nuxt.config.ts` file as follows:
```ts [nuxt.config.ts]
// https://nuxt.com/docs/api/configuration/nuxt-config
export default defineNuxtConfig({
compatibilityDate: '2024-04-03',
devtools: { enabled: true },
nitro: {
esbuild: {
options: {
tsconfigRaw: {
compilerOptions: {
experimentalDecorators: true,
},
},
},
},
},
vite: {
esbuild: {
tsconfigRaw: {
compilerOptions: {
experimentalDecorators: true,
},
},
},
},
})
```
### Bootstrap Remult
1. **Create the API File**
In the `server/api/` directory, create a dynamic API route that integrates Remult with Nuxt. The following code sets up the API and defines the entities to be used:
```ts [server/api/[...remult].ts]
import { remultNuxt } from 'remult/remult-nuxt'
import { Task } from '../../demo/todo/Task.js'
export const api = remultNuxt({
admin: true,
entities: [Task],
})
export default defineEventHandler(api)
```
This setup uses the Remult `Task` entity and registers the API routes dynamically for the entities within the app.
### Run the App
To start the development server, run:
```sh
npm run dev
```
The Nuxt app will now be running on the default address [http://localhost:3000](http://localhost:3000).
### Setup Completed
Your Nuxt app with Remult is now set up and ready to go. You can now move on to defining your entities and building your task list app.
# Framework - SolidStart
# SolidStart
### Step 1: Create a New SolidStart Project
Run the following command to initialize a new SolidStart project:
```sh
npm init solid@latest remult-solid-start
```
Answer the prompts as follows:
```sh
o Is this a Solid-Start project? Yes
o Which template would you like to use? basic
o Use TypeScript? Yes
```
Once completed, navigate to the project directory:
```sh
cd remult-solid-start
```
### Step 2: Install Remult
To install the Remult package, run:
```sh
npm i remult
```
### Step 3: Bootstrap Remult in the Backend
Remult is integrated into `SolidStart` using a [catch-all dynamic API route](https://start.solidjs.com/core-concepts/routing#catch-all-routes), which passes API requests to a handler created using the `remultSolidStart` function.
1. **Create the Remult API Configuration File**
In the `src` directory, create a file named `api.ts` with the following code:
```ts
// src/api.ts
import { remultSolidStart } from 'remult/remult-solid-start'
export const api = remultSolidStart({})
```
2. **Set Up the Catch-All API Route**
In the `src/routes/api/` directory, create a file named `[...remult].ts` with the following code:
```ts
// src/routes/api/[...remult].ts
import { api } from '../../api.js'
export const { POST, PUT, DELETE, GET } = api
```
### Step 4: Enable TypeScript Decorators
1. **Install Babel Plugins for Decorators**:
```sh
npm i -D @babel/plugin-proposal-decorators @babel/plugin-transform-class-properties
```
2. **Configure Babel Plugins in SolidStart**:
Add the following configuration to the `app.config.ts` file to enable TypeScript decorators:
```ts{6-14}
// app.config.ts
import { defineConfig } from "@solidjs/start/config"
export default defineConfig({
//@ts-ignore
solid: {
babel: {
plugins: [
["@babel/plugin-proposal-decorators", { version: "legacy" }],
["@babel/plugin-transform-class-properties"],
],
},
},
})
```
### Setup Complete
Your SolidStart project is now set up with Remult and ready to run. You can now proceed to the next steps of building your application.
# Stacks - Server
# Select a server
# Server - Express
# Express
### Install Required Packages
To set up your Express server with Remult, run the following commands to install the necessary packages:
```sh
npm install express remult
npm install --save-dev @types/express tsx
```
### Bootstrap Remult in the Backend
Remult is integrated into your backend as an `Express middleware`.
1. **Create the API File**
Create a new `api.ts` file in the `src/server/` folder with the following code to set up the Remult middleware:
```ts
// src/server/api.ts
import { remultExpress } from 'remult/remult-express'
export const api = remultExpress()
```
2. **Register the Middleware**
Update the `index.ts` file in your `src/server/` folder to include the Remult middleware. Add the following lines:
```ts{4,7}
// src/server/index.ts
import express from "express"
import { api } from "./api.js"
const app = express()
app.use(api)
app.listen(3002, () => console.log("Server started"))
```
::: warning ESM Configuration
In this tutorial, we are using ECMAScript modules (`esm`) for the Node.js server. This means that when importing files, you must include the `.js` suffix (as shown in the `import { api } from "./api.js"` statement).
Additionally, make sure to set `"type": "module"` in your `package.json` file.
:::
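For reference, the relevant `package.json` entry looks like this (excerpt):
```json
// package.json (excerpt)
{
  "type": "module"
}
```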
#### Create the Server's TypeScript Configuration
In the root folder, create a TypeScript configuration file named `tsconfig.server.json` to manage the server's settings:
```json
{
"compilerOptions": {
"experimentalDecorators": true,
"skipLibCheck": true,
"esModuleInterop": true,
"outDir": "dist",
"rootDir": "src",
"module": "nodenext"
},
"include": ["src/server/**/*", "src/shared/**/*"]
}
```
This configuration enables TypeScript decorators, ensures compatibility with ECMAScript modules, and specifies the file paths for the server and shared code.
#### Create an `npm` Script to Start the API Server
To simplify the development process, add a new script in your `package.json` file to start the Express server in development mode:
```json
// package.json
"dev-node": "tsx watch --env-file=.env --tsconfig tsconfig.server.json src/server"
```
- `tsx`: A TypeScript Node.js execution environment that watches for file changes and automatically restarts the server on each save.
- `--env-file=.env`: Ensures environment variables are loaded from the `.env` file.
- `--tsconfig tsconfig.server.json`: Specifies the TypeScript configuration file for the server.
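For clarity, here's a sketch of where this entry sits inside `package.json`:
```json
// package.json (excerpt)
{
  "scripts": {
    "dev-node": "tsx watch --env-file=.env --tsconfig tsconfig.server.json src/server"
  }
}
```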
#### Start the Node Server
Finally, open a new terminal and run the following command to start the development server:
```sh
npm run dev-node
```
The server will now run on port 3002. `tsx` will watch for any file changes, automatically restarting the server whenever updates are made.
# Server - Fastify
# Fastify
### Install Required Packages
To set up your Fastify server with Remult, run the following commands to install the necessary packages:
```sh
npm install fastify remult
npm install --save-dev tsx
```
### Bootstrap Remult in the Backend
Remult is integrated into your backend as Fastify middleware.
1. **Create the API File**
Create a new `api.ts` file in the `src/server/` folder with the following code to set up the Remult middleware for Fastify:
```ts
// src/server/api.ts
import { remultFastify } from 'remult/remult-fastify'
export const api = remultFastify()
```
2. **Register the Middleware**
Update the `index.ts` file in your `src/server/` folder to include the Remult middleware. Add the following lines:
```ts{5,9}
// src/server/index.ts
import fastify from "fastify"
import { api } from "./api.js"
const app = fastify()
app.register(api)
app.listen({ port: 3002 }, () => console.log("Server started"))
```
::: warning ESM Configuration
Similar to the Express setup, when using ECMAScript modules (`esm`) in Fastify, you must include the `.js` suffix when importing files (as shown in the `import { api } from "./api.js"` statement).
Also, ensure that `"type": "module"` is set in your `package.json`.
:::
#### Create the Server's TypeScript Configuration
In the root folder, create a TypeScript configuration file named `tsconfig.server.json` for the server project:
```json
{
"compilerOptions": {
"experimentalDecorators": true,
"skipLibCheck": true,
"esModuleInterop": true,
"outDir": "dist",
"rootDir": "src",
"module": "nodenext"
},
"include": ["src/server/**/*", "src/shared/**/*"]
}
```
This configuration enables TypeScript decorators, ensures compatibility with ECMAScript modules, and specifies the file paths for the server and shared code.
#### Create an `npm` Script to Start the API Server
To simplify the development process, add a new script in your `package.json` to start the Fastify server in development mode:
```json
// package.json
"dev-node": "tsx watch --env-file=.env --tsconfig tsconfig.server.json src/server"
```
- `tsx`: A TypeScript Node.js execution environment that watches for file changes and automatically restarts the server on each save.
- `--env-file=.env`: Ensures environment variables are loaded from the `.env` file.
- `--tsconfig tsconfig.server.json`: Specifies the TypeScript configuration file for the server.
#### Start the Fastify Server
Open a new terminal and run the following command to start the development server:
```sh
npm run dev-node
```
The server will now run on port 3002. `tsx` will watch for any file changes, automatically restarting the Fastify server whenever updates are made.
# Server - Hono
# Hono
### Install Required Packages
To set up your Hono server with Remult, install the necessary packages:
```sh
npm install hono @hono/node-server remult
npm install --save-dev tsx
```
### Bootstrap Remult in the Backend
Remult is integrated into your backend using the `remultHono` adapter for Hono.
1. **Create the API File**
Create a new `api.ts` file in the `src/server/` folder with the following code to set up the Remult middleware for Hono:
```ts
// src/server/api.ts
import { remultHono } from 'remult/remult-hono'
export const api = remultHono()
```
2. **Register the Middleware**
Update the `index.ts` file in your `src/server/` folder to include the Remult middleware. Add the following code:
```ts{5,7-8}
// src/server/index.ts
import { Hono } from 'hono'
import { serve } from '@hono/node-server'
import { api } from './api.js'
const app = new Hono()
app.route('', api)
serve(app, { port: 3002 })
```
::: warning ESM Configuration
When using ECMAScript modules (`esm`) in Hono, ensure you include the `.js` suffix when importing files, as shown in the `import { api } from './api.js'` statement.
Also, make sure that `"type": "module"` is set in your `package.json`.
:::
#### Create the Server's TypeScript Configuration
In the root folder, create a TypeScript configuration file named `tsconfig.server.json` for the Hono server:
```json
{
"compilerOptions": {
"experimentalDecorators": true,
"skipLibCheck": true,
"esModuleInterop": true,
"outDir": "dist",
"rootDir": "src",
"module": "nodenext"
},
"include": ["src/server/**/*", "src/shared/**/*"]
}
```
This configuration enables TypeScript decorators, ensures compatibility with ECMAScript modules, and specifies the file paths for the server and shared code.
#### Create an `npm` Script to Start the API Server
Add a new script in your `package.json` to start the Hono server in development mode:
```json
// package.json
"dev-node": "tsx watch --env-file=.env --tsconfig tsconfig.server.json src/server"
```
- `tsx`: A TypeScript execution environment that watches for file changes and automatically restarts the server on each save.
- `--env-file=.env`: Ensures environment variables are loaded from the `.env` file.
- `--tsconfig tsconfig.server.json`: Specifies the TypeScript configuration file for the server.
#### Start the Hono Server
Open a new terminal and run the following command to start the development server:
```sh
npm run dev-node
```
The server will now run on port 3002. `tsx` will watch for file changes, automatically restarting the Hono server whenever updates are made.
# Server - Hapi
# Hapi
### Install Required Packages
To set up your Hapi server with Remult, install the necessary packages:
```sh
npm install @hapi/hapi remult
npm install --save-dev tsx
```
### Bootstrap Remult in the Backend
Remult is integrated into your backend as a Hapi plugin.
1. **Create the API File**
Create a new `api.ts` file in the `src/server/` folder with the following code to set up the Remult middleware for Hapi:
```ts
// src/server/api.ts
import { remultHapi } from 'remult/remult-hapi'
export const api = remultHapi()
```
2. **Register the Middleware**
Update the `index.ts` file in your `src/server/` folder to include the Remult middleware. Add the following code:
```ts{5-7,10}
// src/server/index.ts
import { server } from '@hapi/hapi'
import { api } from './api.js'
const hapi = server({ port: 3002 })
await hapi.register(api)
hapi.start().then(() => console.log("Server started"))
```
::: warning ESM Configuration
When using ECMAScript modules (`esm`) in Hapi, ensure you include the `.js` suffix when importing files, as shown in the `import { api } from './api.js'` statement.
Also, make sure that `"type": "module"` is set in your `package.json`.
:::
#### Create the Server's TypeScript Configuration
In the root folder, create a TypeScript configuration file named `tsconfig.server.json` for the Hapi server:
```json
{
"compilerOptions": {
"experimentalDecorators": true,
"skipLibCheck": true,
"esModuleInterop": true,
"outDir": "dist",
"rootDir": "src",
"module": "nodenext"
},
"include": ["src/server/**/*", "src/shared/**/*"]
}
```
This configuration enables TypeScript decorators, ensures compatibility with ECMAScript modules, and specifies the file paths for the server and shared code.
#### Create an `npm` Script to Start the API Server
Add a new script in your `package.json` to start the Hapi server in development mode:
```json
// package.json
"dev-node": "tsx watch --env-file=.env --tsconfig tsconfig.server.json src/server"
```
- `tsx`: A TypeScript execution environment that watches for file changes and automatically restarts the server on each save.
- `--env-file=.env`: Ensures environment variables are loaded from the `.env` file.
- `--tsconfig tsconfig.server.json`: Specifies the TypeScript configuration file for the server.
#### Start the Hapi Server
Open a new terminal and run the following command to start the development server:
```sh
npm run dev-node
```
The server will now run on port 3002. `tsx` will watch for file changes, automatically restarting the Hapi server whenever updates are made.
# Server - Koa
# Koa
### Install Required Packages
To set up your Koa server with Remult, run the following commands to install the necessary packages:
```sh
npm install koa koa-bodyparser remult
npm install --save-dev @types/koa @types/koa-bodyparser tsx
```
### Bootstrap Remult in the Backend
Remult is integrated into your backend as middleware for Koa.
1. **Create the API File**
Create a new `api.ts` file in the `src/server/` folder with the following code to set up the Remult middleware:
```ts title="src/server/api.ts"
// src/server/api.ts
import { createRemultServer } from 'remult/server'
export const api = createRemultServer()
```
2. **Register the Middleware**
Update the `index.ts` file in your `src/server/` folder to include the Remult middleware. Add the following lines:
```ts title="src/server/index.ts" add={3,5,9}
// src/server/index.ts
import Koa from 'koa'
import bodyParser from 'koa-bodyparser'
import { api } from './api.js'
const app = new Koa()
app.use(bodyParser()) // Enables JSON body parsing for API requests
app.use(async (ctx, next) => {
const r = await api.handle(ctx.request) // Handle API requests with Remult
if (r) {
ctx.response.body = r.data
ctx.response.status = r.statusCode
} else {
await next() // If not handled by Remult, pass on to the next middleware
}
})
app.listen(3002, () => {
console.log('Server started on port 3002')
})
```
::: warning ESM Configuration
In this tutorial, we are using ECMAScript modules (`esm`) for the Node.js server. When importing files, you must include the `.js` suffix (as shown in the `import { api } from "./api.js"` statement).
Additionally, make sure to set `"type": "module"` in your `package.json` file.
:::
#### Create the Server's TypeScript Configuration
In the root folder, create a TypeScript configuration file named `tsconfig.server.json` to manage the server's settings:
```json
{
"compilerOptions": {
"experimentalDecorators": true,
"skipLibCheck": true,
"esModuleInterop": true,
"outDir": "dist",
"rootDir": "src",
"module": "nodenext"
},
"include": ["src/server/**/*", "src/shared/**/*"]
}
```
This configuration enables TypeScript decorators, ensures compatibility with ECMAScript modules, and specifies the file paths for the server and shared code.
#### Create an `npm` Script to Start the API Server
To simplify the development process, add a new script in your `package.json` file to start the Koa server in development mode:
```json
// package.json
"dev-node": "tsx watch --env-file=.env --tsconfig tsconfig.server.json src/server"
```
- `tsx`: A TypeScript Node.js execution environment that watches for file changes and automatically restarts the server on each save.
- `--env-file=.env`: Ensures environment variables are loaded from the `.env` file.
- `--tsconfig tsconfig.server.json`: Specifies the TypeScript configuration file for the server.
#### Start the Koa Server
Finally, open a new terminal and run the following command to start the development server:
```sh
npm run dev-node
```
The server will now run on port 3002. `tsx` will watch for any file changes, automatically restarting the server whenever updates are made.
# Server - Nest.js
# Nest.js
### Bootstrap Remult in the Nest.js back-end
1. Create a `main.ts` file in the `src/` folder with the following code:
```ts title="src/main.ts"
// src/main.ts
import { NestFactory } from '@nestjs/core'
import { AppModule } from './app.module'
import { remultExpress } from 'remult/remult-express'
async function bootstrap() {
const app = await NestFactory.create(AppModule)
app.use(remultExpress()) // Integrate Remult as middleware
await app.listen(3002) // Start server on port 3002
}
bootstrap()
```
2. Add a simple `AppModule` in `src/app.module.ts`:
```ts title="src/app.module.ts"
// src/app.module.ts
import { Module } from '@nestjs/common'
@Module({})
export class AppModule {}
```
### Run the Nest.js server
Run the server with:
```sh
npm run start
```
Your Nest.js app with Remult is now up and running on port `3002`.
# Stacks - Database
# Choose a Database
By default, if no database provider is specified, Remult will use a simple JSON file-based database. This will store your data in JSON files located in the `db` folder at the root of your project.
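For example, an Express setup with no `dataProvider` falls back to this file-based storage (a minimal sketch):
```ts
import express from 'express'
import { remultExpress } from 'remult/remult-express'

const app = express()
// No dataProvider specified, so entity data is stored as JSON files under ./db
app.use(remultExpress({ entities: [/* your entities */] }))
app.listen(3002, () => console.log('Server started'))
```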
# Database - Json files
## JSON Files
You can store data in JSON files using Remult. Here's how to configure your server:
### Step 1: Configure the `dataProvider`
In your `index.ts` (or server file), configure the `dataProvider` to use JSON files as the storage mechanism:
```ts{5-6,12-14}
// index.ts
import express from "express"
import { remultExpress } from "remult/remult-express"
import { JsonDataProvider } from "remult"
import { JsonEntityFileStorage } from "remult/server"
const app = express()
app.use(
remultExpress({
dataProvider: async () =>
new JsonDataProvider(new JsonEntityFileStorage("./db")) // Data will be stored in the 'db' folder
})
)
```
### Explanation:
- **`JsonDataProvider`**: This is the data provider that will store your data in JSON format.
- **`JsonEntityFileStorage`**: Specifies the directory where the JSON files will be stored (in this case, `./db`).
- **`"./db"`**: The path where JSON files for entities will be created. Ensure the folder exists or it will be created automatically.
This configuration allows you to store and manage your application data in JSON files, ideal for small projects or quick setups.
# Database - PostgreSQL
# PostgreSQL
To set up PostgreSQL as the database provider for your Remult application, you'll need to configure the `dataProvider` property in the `api.ts` file.
### Step 1: Install the `node-postgres` package
Run the following command to install the necessary PostgreSQL client for Node.js:
```sh
npm i pg
```
### Step 2: Set the `dataProvider` Property
In the `api.ts` file, configure the `dataProvider` property to connect to your PostgreSQL database:
```ts{3,7,11-15}
import express from "express"
import { remultExpress } from "remult/remult-express"
import { createPostgresDataProvider } from "remult/postgres"
const app = express()
const connectionString = "postgres://user:password@host:5432/database"
app.use(
remultExpress({
dataProvider: createPostgresDataProvider({
connectionString, // default: process.env["DATABASE_URL"]
// configuration: {} // optional: a `pg.PoolConfig` object or "heroku"
})
})
)
```
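As the inline comment above suggests, `connectionString` defaults to the `DATABASE_URL` environment variable, so a sketch relying on that variable can simply omit it:
```ts
import express from "express"
import { remultExpress } from "remult/remult-express"
import { createPostgresDataProvider } from "remult/postgres"

const app = express()
app.use(
  remultExpress({
    // connectionString omitted, so process.env["DATABASE_URL"] is used
    dataProvider: createPostgresDataProvider({}),
  })
)
```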
### Alternative: Use an Existing PostgreSQL Connection
If you already have a PostgreSQL connection set up, you can pass it directly to Remult:
```ts
import express from 'express'
import { Pool } from 'pg'
import { SqlDatabase } from 'remult'
import { PostgresDataProvider } from 'remult/postgres'
import { remultExpress } from 'remult/remult-express'
const pg = new Pool({
connectionString: 'your-connection-string-here',
})
const app = express()
app.use(
remultExpress({
dataProvider: new SqlDatabase(new PostgresDataProvider(pg)),
}),
)
```
In this example, the `pg.Pool` is used to create the PostgreSQL connection, and `SqlDatabase` is used to interface with the `PostgresDataProvider`.
# Database - MySQL
# MySQL
### Step 1: Install `knex` and `mysql2`
Run the following command to install the required packages:
```sh
npm i knex mysql2
```
### Step 2: Set the `dataProvider` Property
In your `api.ts` file, configure the `dataProvider` to connect to your MySQL database using `Knex`:
```ts{3,9-18}
import express from "express"
import { remultExpress } from "remult/remult-express"
import { createKnexDataProvider } from "remult/remult-knex"
const app = express()
app.use(
remultExpress({
dataProvider: createKnexDataProvider({
client: "mysql2", // Specify the MySQL client
connection: {
user: "your_database_user",
password: "your_database_password",
host: "127.0.0.1",
database: "test",
},
}),
})
)
```
### Alternative: Use an Existing Knex Provider
If you're already using a `knex` instance in your project, you can pass it directly to Remult:
```ts
import express from 'express'
import { KnexDataProvider } from 'remult/remult-knex'
import { remultExpress } from 'remult/remult-express'
import knex from 'knex'
const knexDb = knex({
client: 'mysql2',
connection: {
user: 'your_database_user',
password: 'your_database_password',
host: '127.0.0.1',
database: 'test',
},
})
const app = express()
app.use(
remultExpress({
dataProvider: new KnexDataProvider(knexDb), // Use the existing knex instance
}),
)
```
# Database - MongoDB
## MongoDB
To use MongoDB as the database provider for your Remult application, follow the steps below.
### Step 1: Install MongoDB Driver
Run the following command to install the `mongodb` package:
```sh
npm i mongodb
```
### Step 2: Set the `dataProvider` Property
In your `api.ts` or server file, configure the `dataProvider` to connect to your MongoDB database:
```ts{3-4,10-14}
import express from "express"
import { remultExpress } from "remult/remult-express"
import { MongoClient } from "mongodb"
import { MongoDataProvider } from "remult/remult-mongo"
const app = express()
app.use(
remultExpress({
dataProvider: async () => {
const client = new MongoClient("mongodb://localhost:27017/local")
await client.connect()
return new MongoDataProvider(client.db("test"), client)
}
})
)
```
This setup connects to a MongoDB instance running on `localhost` and uses the `test` database. The `MongoDataProvider` manages the connection, allowing Remult to interact with MongoDB seamlessly.
# Database - SQLite3
### SQLite3 Setup
This version of **SQLite3** works well even on platforms like StackBlitz.
### Step 1: Install SQLite3
Run the following command to install the `sqlite3` package:
```sh
npm i sqlite3
```
### Step 2: Configure the `dataProvider`
In your `api.ts` or server file, configure the `dataProvider` to connect to the SQLite database using **sqlite3**:
```ts
import express from 'express'
import { remultExpress } from 'remult/remult-express'
import { SqlDatabase } from 'remult'
import sqlite3 from 'sqlite3'
import { Sqlite3DataProvider } from 'remult/remult-sqlite3'
const app = express()
app.use(
remultExpress({
dataProvider: new SqlDatabase(
new Sqlite3DataProvider(new sqlite3.Database('./mydb.sqlite')),
),
}),
)
```
This configuration connects to an SQLite database stored in the `mydb.sqlite` file. The `Sqlite3DataProvider` is wrapped inside the `SqlDatabase` class, enabling Remult to work with SQLite databases smoothly across different environments, including StackBlitz.
# Database - Better SQLite3
### Better-sqlite3
To use **Better-sqlite3** as the database provider for your Remult application, follow these steps:
### Step 1: Install Better-sqlite3
Run the following command to install the `better-sqlite3` package:
```sh
npm i better-sqlite3
```
### Step 2: Configure the `dataProvider`
In your `api.ts` or server file, configure the `dataProvider` to connect to the SQLite database using **Better-sqlite3**:
```ts
import express from 'express'
import { remultExpress } from 'remult/remult-express'
import { SqlDatabase } from 'remult'
import Database from 'better-sqlite3'
import { BetterSqlite3DataProvider } from 'remult/remult-better-sqlite3'
const app = express()
app.use(
remultExpress({
dataProvider: new SqlDatabase(
new BetterSqlite3DataProvider(new Database('./mydb.sqlite')),
),
}),
)
```
This setup connects to an SQLite database stored in the `mydb.sqlite` file. The `BetterSqlite3DataProvider` is wrapped inside the `SqlDatabase` class to allow Remult to interact with SQLite efficiently.
# Database - MSSQL
# Microsoft SQL Server
### Step 1: Install Required Packages
Install `knex` and `tedious` to enable Microsoft SQL Server integration.
```sh
npm i knex tedious
```
### Step 2: Configure the `dataProvider`
In your `index.ts` (or server file), configure the `dataProvider` to use Microsoft SQL Server with the following `knex` client configuration:
```ts{5,11-25}
// index.ts
import express from "express"
import { remultExpress } from "remult/remult-express"
import { createKnexDataProvider } from "remult/remult-knex"
const app = express()
app.use(
remultExpress({
dataProvider: createKnexDataProvider({
// Knex client configuration for MSSQL
client: "mssql",
connection: {
server: "127.0.0.1", // SQL Server address
database: "test", // Your database name
user: "your_database_user", // SQL Server user
password: "your_database_password", // Password for the SQL Server user
options: {
enableArithAbort: true, // Required option for newer versions of MSSQL
encrypt: false, // Set to true if using Azure
instanceName: "sqlexpress", // Optional: Define the SQL Server instance name
},
},
}),
})
)
```
### Step 3: Use an Existing `knex` Provider (Optional)
If you have an existing `knex` instance, you can easily integrate it with Remult like this:
```ts
import express from 'express'
import { KnexDataProvider } from 'remult/remult-knex'
import { remultExpress } from 'remult/remult-express'
import knex from 'knex'
const knexDb = knex({
client: 'mssql', // Specify MSSQL as the client
connection: {
// Add your MSSQL connection details here
server: '127.0.0.1',
user: 'your_database_user',
password: 'your_database_password',
database: 'test',
},
})
const app = express()
app.use(
remultExpress({
dataProvider: new KnexDataProvider(knexDb), // Use your existing knex instance
}),
)
```
### Explanation:
- **`tedious`**: The underlying driver used by `knex` to connect to SQL Server.
- **`client: "mssql"`**: Specifies that we are using Microsoft SQL Server.
- **`createKnexDataProvider`**: Allows you to use `knex` to connect to SQL Server as the data provider for Remult.
- **`options`**: The additional configuration for SQL Server, including `enableArithAbort` and `encrypt`.
This setup lets you easily connect Remult to Microsoft SQL Server using `knex` for query building and `tedious` as the driver.
# Database - Bun SQLite
### Bun:SQLite
### Step 1: Configure the `dataProvider`
In your `api.ts` or server file, configure the `dataProvider` to use `bun:sqlite` as follows:
```ts
import express from 'express'
import { remultExpress } from 'remult/remult-express'
import { SqlDatabase } from 'remult'
import { Database } from 'bun:sqlite'
import { BunSqliteDataProvider } from 'remult/remult-bun-sqlite'
const app = express()
app.use(
remultExpress({
dataProvider: new SqlDatabase(
new BunSqliteDataProvider(new Database('./mydb.sqlite')),
),
}),
)
```
### Explanation:
- **bun:sqlite**: This uses Bun's native SQLite database, `bun:sqlite`, to manage SQLite databases efficiently in a Bun-based environment.
- **BunSqliteDataProvider**: The `BunSqliteDataProvider` integrates the Bun SQLite database as a data provider for Remult.
- **SqlDatabase**: Wraps the `BunSqliteDataProvider` to make it compatible with Remult's SQL-based data provider system.
This setup allows you to use Bun's SQLite implementation as the database provider for your Remult application, leveraging Bun’s performance benefits with SQLite.
# Database - sqljs
### sql.js
### Step 1: Install sql.js
Run the following command to install the `sql.js` package:
```sh
npm i sql.js
```
### Step 2: Configure the `dataProvider`
In your `api.ts` or server file, configure the `dataProvider` to use `sql.js`:
```ts
import express from 'express'
import { remultExpress } from 'remult/remult-express'
import { SqlDatabase } from 'remult'
import initSqlJs from 'sql.js'
import { SqlJsDataProvider } from 'remult/remult-sql-js'
const app = express()
app.use(
remultExpress({
dataProvider: new SqlDatabase(
new SqlJsDataProvider(initSqlJs().then((SQL) => new SQL.Database())),
),
}),
)
```
### Explanation:
- **sql.js**: This setup initializes an in-memory SQLite database using `sql.js`, a library that runs SQLite in the browser or in Node.js.
- **SqlJsDataProvider**: The `SqlJsDataProvider` is used to integrate the `sql.js` database as a Remult data provider.
- **Async Initialization**: The `initSqlJs()` function initializes the SQL.js engine and sets up the database instance.
This configuration allows you to use an in-memory SQLite database in your Remult application, powered by `sql.js`.
# Database - Turso
### Turso Setup
### Step 1: Install Turso Client
Run the following command to install the `@libsql/client` package:
```sh
npm install @libsql/client
```
### Step 2: Configure the `dataProvider`
In your `api.ts` or server file, configure the `dataProvider` to connect to Turso using the Turso client:
```ts
import express from 'express'
import { remultExpress } from 'remult/remult-express'
import { SqlDatabase } from 'remult'
import { createClient } from '@libsql/client'
import { TursoDataProvider } from 'remult/remult-turso'
const app = express()
app.use(
remultExpress({
dataProvider: new SqlDatabase(
new TursoDataProvider(
createClient({
url: process.env.TURSO_DATABASE_URL,
authToken: process.env.TURSO_AUTH_TOKEN,
}),
),
),
}),
)
```
### Explanation:
- **Turso Client**: This configuration uses the `@libsql/client` package to connect to the Turso database.
- **Environment Variables**: Ensure you have `TURSO_DATABASE_URL` and `TURSO_AUTH_TOKEN` defined in your environment to securely pass the database connection URL and authentication token.
- **SqlDatabase**: The `TursoDataProvider` is wrapped with the `SqlDatabase` class, allowing seamless integration of Turso as a Remult data provider.
This setup allows you to use Turso as the backend database for your application.
# Database - DuckDb
## DuckDB
To use DuckDB as the database provider in your Remult-based application, follow these steps:
### Step 1: Install DuckDB
Run the following command to install `duckdb`:
```sh
npm i duckdb
```
### Step 2: Configure the `dataProvider`
In your `index.ts` (or server file), configure the `dataProvider` to use DuckDB:
```ts
import express from 'express'
import { remultExpress } from 'remult/remult-express'
import { SqlDatabase } from 'remult' // [!code highlight]
import { Database } from 'duckdb' // [!code highlight]
import { DuckDBDataProvider } from 'remult/remult-duckdb' // [!code highlight]
const app = express()
app.use(
remultExpress({
dataProvider: new SqlDatabase( // [!code highlight]
new DuckDBDataProvider(new Database(':memory:')), // [!code highlight]
), // [!code highlight]
}),
)
app.listen(3000, () => console.log('Server is running on port 3000'))
```
### Explanation:
- **DuckDB setup**: The database is initialized with `new Database(':memory:')` to create an in-memory database. Replace `':memory:'` with a file path if you want to persist the database to disk.
- **SqlDatabase**: `SqlDatabase` is used to connect Remult with DuckDB through the `DuckDBDataProvider`.
This setup allows you to use DuckDB as your database provider in a Remult project.
# Database - Oracle
## Oracle Database
To use an Oracle database as the data provider for your Remult-based application, follow these steps:
### Step 1: Install Required Packages
Install `knex` and `oracledb`:
```sh
npm i knex oracledb
```
### Step 2: Configure the `dataProvider`
In your `index.ts` (or server file), configure the `dataProvider` to use Oracle through `knex`:
```ts{5,11-19}
// index.ts
import express from "express"
import { remultExpress } from "remult/remult-express"
import { createKnexDataProvider } from "remult/remult-knex"
const app = express()
app.use(
remultExpress({
dataProvider: createKnexDataProvider({
// Knex client configuration for Oracle
client: "oracledb",
connection: {
user: "your_database_user",
password: "your_database_password",
connectString: "SERVER" // Specify your Oracle server connection string
}
})
})
)
app.listen(3000, () => console.log("Server is running on port 3000"))
```
### Step 3: Using an Existing `knex` Provider
If you're already using a `knex` instance, you can easily plug it into Remult:
```ts
import express from 'express'
import { KnexDataProvider } from 'remult/remult-knex'
import { remultExpress } from 'remult/remult-express'
import knex from 'knex'
const knexDb = knex({
client: 'oracledb',
connection: {
user: 'your_database_user',
password: 'your_database_password',
connectString: 'SERVER',
},
})
const app = express()
app.use(
remultExpress({
dataProvider: new KnexDataProvider(knexDb), // Reuse your existing knex provider
}),
)
app.listen(3000, () => console.log('Server is running on port 3000'))
```
### Explanation:
- **Knex configuration**: `client: "oracledb"` indicates you're using Oracle, and `connection` contains the necessary credentials and connection string.
- **Existing knex provider**: If you already have a `knex` instance, it can be reused directly with Remult.
This setup integrates Oracle into your Remult-based application.
# Server-side Code - Backend Methods
# Backend Methods
Backend methods run on the backend and are used to improve performance, execute server-only code (e.g., sending emails), or perform operations not accessible through the API.
## Static Backend Methods
Static backend methods represent the most straightforward type, transmitting their parameters to the backend and delivering their outcome to the frontend.
1. **Define the Backend Method:**
```typescript
import { BackendMethod, remult } from 'remult'
import { Task } from './Task'
export class TasksController {
/**
* Sets the completion status of all tasks.
* @param {boolean} completed - The completion status to set for all tasks.
*/
@BackendMethod({ allowed: true })
static async setAll(completed: boolean) {
const taskRepo = remult.repo(Task)
for (const task of await taskRepo.find()) {
await taskRepo.save({ ...task, completed })
}
}
}
```
Each controller can house one or more backend methods, each serving distinct purposes tailored to your application's needs. In the provided example, the `TasksController` class contains a single backend method named `setAll`, responsible for setting the completion status of all tasks.
The method name, such as `setAll`, serves as the URL for the corresponding REST endpoint on the backend server. It's worth noting that you can configure a prefix for these endpoints using the `apiPrefix` option, providing flexibility in structuring your backend API routes.
The `allowed: true` option signifies that the backend method can be invoked by anyone. Alternatively, you can customize the authorization settings for finer control over who can access the method.
For instance, setting `allowed: Allow.authenticated` restricts access to authenticated users only, ensuring that only logged-in users can invoke the method.
Similarly, specifying `allowed: 'admin'` limits access to users with administrative privileges.
These options offer granular control over authorization, allowing you to tailor access permissions to your application's requirements and security considerations (a short sketch of `apiPrefix` and role-based `allowed` follows at the end of this section).
2. **Register the Controller:**
```typescript
// Register TasksController in the controllers array of the remultExpress options
export const api = remultExpress({
entities: [Task],
controllers: [TasksController],
})
```
3. **Call from the Frontend:**
```typescript
await TasksController.setAll(true)
```
This example demonstrates how to define and use a static backend method, `setAll`, within the `TasksController` class. When called from the frontend, this method sets the completion status of all tasks to the specified value (`true` in this case). The method leverages Remult's `BackendMethod` decorator to handle the communication between the frontend and backend seamlessly.
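Below is a short sketch of the `apiPrefix` and role-based `allowed` options mentioned above; the controller name, prefix value, and roles are illustrative, so adapt them to your own API:
```typescript
import { Allow, BackendMethod, remult } from 'remult'
import { Task } from './Task'

export class TasksStatsController {
  // Grouped under an additional path segment via apiPrefix (e.g. /api/tasks/countCompleted)
  @BackendMethod({ allowed: Allow.authenticated, apiPrefix: 'tasks' })
  static async countCompleted() {
    return remult.repo(Task).count({ completed: true })
  }

  // Only users with the 'admin' role may call this method
  @BackendMethod({ allowed: 'admin' })
  static async deleteCompleted() {
    const taskRepo = remult.repo(Task)
    for (const task of await taskRepo.find({ where: { completed: true } })) {
      await taskRepo.delete(task)
    }
  }
}
```
Remember to register any additional controller in the `controllers` array, just like `TasksController` above.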
# Server-side Code - Server-only Dependencies
# Backend only code
One of the main advantages of remult is that you write code once, and it runs both on the server and in the browser.
However, if you are using a library that only works on the server, the fact that the same code is bundled to the frontend can cause problems. For example, when you build an Angular project, you'll get `Module not found` errors.
This article will walk through such a scenario and how it can be solved.
For this example, our customer would like us to document each call to the `updatePriceOnBackend` method in a log file.
Our first instinct would be to add in the `products.controller.ts` file an import to `fs` (Node JS file system component) and write the following code:
```ts{1,10}
import * as fs from 'fs';
.....
@BackendMethod({allowed:true})
static async updatePriceOnBackend(priceToUpdate:number,remult?:Remult){
let products = await remult.repo(Products).find();
for (const p of products) {
p.price.value += priceToUpdate;
await p.save();
}
fs.appendFileSync('./logs/log.txt', new Date() + " " + remult.user.name + " update price\n");
}
```
::: danger Error
As soon as we do that, we'll get the following errors on the `ng-serve` terminal
```sh
ERROR in ./src/app/products/products.controller.ts
Module not found: Error: Can't resolve 'fs' in 'C:\try\test19\my-project\src\app\products'
i 「wdm」: Failed to compile.
```
:::
We get this error because the `fs` module on which we rely here is only relevant in the context of a `Node JS` server and not in the context of the browser.
There are two ways to handle this:
## Solution 1 - exclude from bundler
::: tabs
== vite
### Exclude in `vite.config`
Instruct vite to exclude the `server-only` packages from the bundle
```ts
import { defineConfig } from 'vite'
import react from '@vitejs/plugin-react'
// https://vitejs.dev/config/
export default defineConfig({
plugins: [react()],
build: { // [!code ++]
rollupOptions: { // [!code ++]
external: ['fs', 'nodemailer', 'node-fetch'], // [!code ++]
}, // [!code ++]
}, // [!code ++]
optimizeDeps: { // [!code ++]
exclude: ['fs', 'nodemailer', 'node-fetch'], // [!code ++]
}, // [!code ++]
})
```
== Webpack and Angular version <=16
Instruct `webpack` not to include the `fs` package in the `frontend` bundle by adding the following JSON to the main section of the project's `package.json` file.
_package.json_
```json
"browser": {
"jsonwebtoken": false
}
```
- note that you'll need to restart the react/angular dev server.
== Angular 17
1. You'll need to either remove `types` entry in the `tsconfig.app.json` or add the types you need to that types array.
2. In `angular.json` you'll need to add an entry called `externalDependencies` to the `architect/build/options` key for your project
```json{21-23}
// angular.json
{
"$schema": "./node_modules/@angular/cli/lib/config/schema.json",
"version": 1,
"newProjectRoot": "projects",
"projects": {
"remult-angular-todo": {
"projectType": "application",
"schematics": {},
"root": "",
"sourceRoot": "src",
"prefix": "app",
"architect": {
"build": {
"builder": "@angular-devkit/build-angular:application",
"options": {
"outputPath": "dist/remult-angular-todo",
"index": "src/index.html",
"browser": "src/main.ts",
"externalDependencies": [
"fs"
],
"polyfills": [
"zone.js"
],
//...
```
:::
## Solution 2 - abstract the call
Abstract the call and separate it to backend only files and `inject` it only when we are running on the server.
**Step 1**, abstract the call - We'll remove the import to `fs,` and instead of calling specific `fs` methods, we'll define and call a method `writeToLog` that describes what we are trying to do:
```ts
import * as fs from 'fs'; // [!code --]
// We'll define an abstract `writeToLog` function and use it in our code
static writeToLog:(textToWrite:string)=>void; // [!code ++]
.....
@BackendMethod({allowed:true})
static async updatePriceOnBackend(priceToUpdate:number,remult?:Remult){
let products = await remult.repo(Products).find();
for (const p of products) {
p.price.value += priceToUpdate;
await p.save();
}
fs.appendFileSync('./logs/log.txt', new Date() + " " + remult.user.name + " update price\n"); // [!code --]
ProductsController.writeToLog(new Date() + " " + remult.user.name + " update price\n"); // [!code ++]
}
```
The `writeToLog` method that we've defined serves as a placeholder which we'll assign an implementation to in the context of the server.
It receives one parameter of type `string` and returns `void`.
**Step 2**, implement the method:
In the `/src/app/server` folder, we'll add a file called `log-writer.ts` with the following code:
```ts{3}
import * as fs from 'fs';
import { ProductsController } from '../products/products.controller';
ProductsController.writeToLog = what => fs.appendFileSync('./logs/log.txt', what);
```
Here we set the implementation of the `writeToLog` method with the actual call to the `fs` module.
This file is intended to only run on the server, so it'll not present us with any problem.
**Step 3**, load the `log-writer.ts` file:
In the `/src/app/server/server-init.ts` file, load the `log-writer.ts` file using an `import` statement
```ts{2}
import '../app.module';
import './log-writer'; //load the log-writer.ts file
import { Pool } from 'pg';
import { config } from 'dotenv';
import { PostgresDataProvider, PostgresSchemaBuilder } from '@remult/server-postgres';
import * as passwordHash from 'password-hash';
```
That's it - it'll work now.
::: tip
If you're still getting an error, check that you have a `logs` folder in your project :)
:::
## Additional Resources
Check out this video where I implemented a similar solution when running into the same problem using `bcrypt`.
# Guides - Access Control
# Access Control
::: tip **Interactive Learning Available! 🚀**
Looking to get hands-on with this topic? Try out our new [**interactive tutorial**](https://learn.remult.dev/in-depth/7-access-control/1-field-level-control) on Access Control, where you can explore and practice directly in the browser. This guided experience offers step-by-step lessons to help you master Access Control in Remult with practical examples and exercises.
[Click here to dive into the interactive tutorial on Access Control!](https://learn.remult.dev/in-depth/7-access-control/1-field-level-control)
:::
Access control is essential for ensuring that users can only access resources they are authorized to in web applications. This article explores the various layers of access control, focusing on a framework that provides a granular approach to securing your application.
## Entity-Level Authorization
Entity-level authorization governs CRUD (Create, Read, Update, Delete) operations at the entity level. Each entity can define permissions for these operations using the following options:
- `allowApiRead`: Controls read access.
- `allowApiInsert`: Controls insert access.
- `allowApiUpdate`: Controls update access.
- `allowApiDelete`: Controls delete access.
Each option can be set to a boolean, a string role, an array of string roles, or an arrow function:
```typescript
// Allows all CRUD operations
@Entity("tasks", { allowApiCrud: true })
// Only users with the 'admin' role can update
@Entity("tasks", { allowApiUpdate: 'admin' })
// Only users with 'admin' or 'manager' roles can delete
@Entity("tasks", { allowApiDelete: ['admin', 'manager'] })
// Only the user 'Jane' can read
@Entity("tasks", { allowApiRead: () => remult.user?.name == 'Jane' })
// Only authenticated users can perform CRUD operations
@Entity("tasks", { allowApiCrud: Allow.authenticated })
```
## Row-Level Authorization
Row-level authorization allows control over which rows a user can access or modify.
### Authorization on Specific Rows
The `allowApiUpdate`, `allowApiDelete`, and `allowApiInsert` options can also accept a function that receives the specific item as the first parameter, allowing row-level authorization:
```ts
// Users can only update tasks they own
@Entity("tasks", { allowApiUpdate: task => task.owner == remult.user?.id })
```
### Filtering Accessible Rows
To limit the rows a user has access to, use the `apiPrefilter` option:
```ts
@Entity("tasks", {
apiPrefilter: () => {
// Admins can access all rows
if (remult.isAllowed("admin")) return {}
// Non-admins can only access rows where they are the owner
return { owner: remult.user!.id }
}
})
```
The `apiPrefilter` adds a filter to all CRUD API requests, ensuring that only authorized data is accessible through the API.
### Preprocessing Filters for API Requests
For more complex scenarios, you can use `apiPreprocessFilter` to dynamically modify the filter based on the specific request and additional filter information:
```ts
@Entity("tasks", {
apiPreprocessFilter: async (filter, {getPreciseValues}) => {
// Ensure that users can only query tasks for specific customers
const preciseValues = await getPreciseValues();
if (!preciseValues.customerId) {
throw new ForbiddenError("You must specify a valid customerId filter");
}
return filter;
}
})
```
In this example, `apiPreprocessFilter` uses the `getPreciseValues` method to ensure that users must specify a valid `customerId` filter when querying tasks, allowing for more granular control over the data that is accessible through the API.
**Note:** The `preciseValues` object includes the actual values that are used in the filter. For example, in the code sample above, if the `customerId` filter specifies the values `'1'`, `'2'`, and `'3'`, then `preciseValues.customerId` will be an array containing these values. This allows you to check and enforce specific filter criteria in your preprocessing logic.
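For instance, a sketch of a frontend query and the precise values the preprocessing hook above would observe (the `Task` import path and its `customerId` field are assumptions for illustration):
```ts
import { repo } from 'remult'
import { Task } from './shared/Task' // assumed entity with a customerId field

// Frontend: query tasks for three specific customers
const tasks = await repo(Task).find({ where: { customerId: ['1', '2', '3'] } })
// Backend: inside apiPreprocessFilter for this request,
// preciseValues.customerId will be ['1', '2', '3']
```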
### Warning: API Filters Do Not Affect Backend Queries
It's important to note that `apiPrefilter` and `apiPreprocessFilter` only apply to API requests. They do not affect backend queries, such as those executed through backend methods or non-Remult routes.
For instance, in a sign-in scenario, a backend method might need to check all user records to verify a user's existence without exposing all user data through the API. Once authenticated, the user should only have access to their own record for updates.
### Backend Filters for Consistent Access Control
To apply similar filtering logic to backend queries, you can use `backendPrefilter` and `backendPreprocessFilter`:
```ts
@Entity("tasks", {
backendPrefilter: () => {
// Admins can access all rows
if (remult.isAllowed("admin")) return {}
// Non-admins can only access rows where they are the owner
return { owner: remult.user!.id }
},
backendPreprocessFilter: async (filter, {getPreciseValues}) => {
// Apply additional filtering logic for backend queries
const preciseValues = await getPreciseValues(filter);
if (!preciseValues.owner) {
throw new ForbiddenError("You must specify a valid owner filter");
}
return filter;
}
})
```
In this example, `backendPrefilter` and `backendPreprocessFilter` ensure that non-admin users can only access their own tasks in backend queries, providing consistent access control across both API and backend operations.
## Field-Level Authorization
Field-level authorization allows control over individual fields within an entity:
- `includeInApi`: Determines if the field is included in the API response.
- `allowApiUpdate`: Controls if a field can be updated. If false, any change to the field is ignored.
Examples:
```ts
// This field will not be included in the API response
@Fields.string({ includeInApi: false })
password = ""
// Only users with the 'admin' role can update this field
@Fields.boolean({ allowApiUpdate: "admin" })
admin = false
// Titles can only be updated by the task owner
@Fields.string({ allowApiUpdate: task => task.owner === remult.user!.id })
title = ''
// This field can only be updated when creating a new entity
@Fields.string({ allowApiUpdate: (c) => getEntityRef(c).isNew() })
description = ""
```
### Field Masking
To mask a field, combine a non-API field with a `serverExpression` that returns the masked value:
```ts
// This field is not included in the API response
@Fields.string({ includeInApi: false })
password = ""
// The field value is masked in the API response
@Fields.string({
serverExpression: () => "***",
// Update the real password field when the masked field is changed
saving: async (user, fieldRef, e) => {
if (fieldRef.valueChanged()) {
user.password = await User.hash(user.updatePassword)
}
},
})
updatePassword = ""
```
## BackendMethod Authorization
Backend methods use the `allowed` option to determine authorization:
```ts
// Only authenticated users can execute this method
@BackendMethod({ allowed: Allow.authenticated })
static async doSomething() {
// something
}
```
The `allowed` option can receive a boolean, a string role, an array of role strings, or a function.
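For example, a sketch of the function form, evaluated on the backend before the method runs (the controller and role names are illustrative):
```ts
import { BackendMethod, remult } from 'remult'

export class ReportsController {
  // Allowed for users holding either of these roles
  @BackendMethod({ allowed: () => remult.isAllowed(['admin', 'manager']) })
  static async buildReport() {
    // server-only work goes here
  }
}
```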
## Reusing Access Control Definitions in the Frontend
Access control definitions set in entities can be reused as a single source of truth in the frontend. This allows for consistent and centralized management of access control logic across your application. For example, in a React component, you can conditionally render UI elements based on the access control rules defined in the entity:
::: code-group
```tsx [React]
const userRepo = repo(User)

function UserComponent({ user }: { user: User }) {
  //...
  return (
    <div>
      <span>{user.name}</span>
      {/* Only show the admin field if the current user is allowed to see it */}
      {userRepo.fields.admin.includeInApi(user) && <span>{String(user.admin)}</span>}
      {/* Only show the delete button if the current user is allowed to delete this user */}
      {userRepo.metadata.apiDeleteAllowed(user) && (
        <button onClick={() => userRepo.delete(user)}>Delete</button>
      )}
    </div>
  )
}
```
:::
## Additional Resources
Check out this informative [YouTube video](https://www.youtube.com/watch?v=9lWQwAUcKEM). It discusses the concepts covered in this article and provides practical examples to help you understand how to implement robust access control in your applications.
---
This article provides a comprehensive overview of the layers of access control in web applications, offering a granular approach to securing your application at the entity, row, field, and method levels.
# Guides - Admin UI
# Admin UI
Enjoy a fully featured Admin UI for your entities: perform CRUD operations, view their relationships via the Diagram entry, and manage data securely with the same validations and authorizations as your application.
## Enabling the Admin UI
Add the Admin UI to your application by setting the `admin` option to `true` in the remult configuration.
```ts
export const api = remultSveltekit({
  entities: [],
  admin: true, // Enable the Admin UI
})
```
## Accessing and Using the Admin UI
Navigate to `/api/admin` to access the Admin UI. Here, you can perform CRUD operations on your entities, view their relationships via the Diagram entry, and ensure secure management with the same validations and authorizations as your application.
![Remult Admin](/remult-admin.png)
## Features
- **Entity List**: On the left side of the screen you'll find the entity list. Use the search field to find entities.
- **Entity Details**: Clicking an entity in the menu opens the entity details screen (in the middle), where you can view, filter, and paginate your data _(top right)_. You can also see all relations of an entity by clicking the arrow on the left of each row. The last column is dedicated to actions, where you can edit or delete an entity. At the top left you can add a new entity by clicking the `+`.
- **Entity Diagram**: Clicking the Diagram entry opens the entity diagram screen, where you can see the entity relationships.
![Remult Admin Diagram](/remult-admin-diagram.png)
- **Settings**: At the top left, a menu _(remult logo)_ offers various settings for your admin UI:
  - Always confirm before deleting?
  - Display captions or keys?
  - Multiple options for automatic diagram layout (or arrange your own)
  - Don't use cookies? No problem, you can set a bearer token (it will only be kept in session)
## Video Demo
Watch this quick demo to see the Remult Admin UI in action. It showcases the key features and functionality of the Admin UI, giving you a practical overview of how it can streamline your entity management process.
# Escape Hatches - Custom/SQL Filters
# Leveraging Custom Filters for Enhanced Data Filtering
In modern web applications, efficiently filtering data is essential for providing a seamless user experience. Whether it's an e-commerce platform filtering products, a task management system sorting tasks, or any other application that requires data manipulation, the ability to apply complex filters is crucial. Custom filters offer a powerful solution, enabling developers to create reusable, declarative, and versatile filters that are executed on the backend and easily utilized from the frontend. This article delves into the concept of custom filters, illustrating their advantages and practical applications.
## The Advantages of Custom Filters
Custom filters provide several benefits that make them an attractive choice for handling data filtering in web applications:
1. **Declarative and Readable:** Custom filters allow you to express filtering logic in a clear, declarative manner. This improves code readability and maintainability, making it easier to understand and modify filtering criteria.
2. **Reusability:** By encapsulating filtering logic in custom filters, you can reuse the same filters across different parts of your application, reducing code duplication and ensuring consistency in filtering behavior.
3. **Backend Execution:** Custom filters are evaluated on the backend, leveraging the full capabilities of the underlying database or data provider. This enables more efficient data processing and allows you to perform complex operations that would be difficult or impossible to handle on the frontend.
4. **Composability:** Custom filters can be combined with other filters, both custom and standard, allowing you to build complex filtering logic in a modular and maintainable way.
5. **Flexibility with Data Providers:** Custom filters can be used with various data providers, including SQL databases, in-memory JSON arrays, and others. This flexibility allows you to apply custom filters in different contexts and with different data storage solutions.
6. **Enhanced Security:** When using custom filters with parameterized queries or data provider-specific filtering methods, you can mitigate the risk of injection attacks and ensure that user input is properly sanitized.
## Practical Example: Filtering Orders in an E-Commerce Application
Consider an e-commerce application where you need to filter orders based on their status and creation year. Without custom filters, the filtering logic might be repetitive and scattered throughout the codebase. By using custom filters, you can encapsulate this logic in a reusable component, simplifying the code and making it more maintainable.
In the following sections, we'll explore how to implement custom filters in this scenario, demonstrating their advantages and how they can be used to create more efficient and readable code.
## The Problem with Repetitive Filtering
Consider a scenario where you have an `Order` entity, and you frequently need to filter orders that are considered "active" based on their status and creation year. Without custom filters, your code might look something like this:
```ts
await repo(Order).find({
  where: {
    status: ['created', 'confirmed', 'pending', 'blocked', 'delayed'],
    createdAt: {
      $gte: new Date(year, 0, 1),
      $lt: new Date(year + 1, 0, 1),
    },
  },
})
```
This code is not only repetitive but also clutters your application, making it harder to maintain. Moreover, it generates lengthy REST API calls, such as:
```
/api/orders?status.in=%5B%22created%22%2C%22confirmed%22%2C%22pending%22%2C%22blocked%22%2C%22delayed%22%5D&createdAt.gte=2023-12-31T22%3A00%3A00.000Z&createdAt.lt=2024-12-31T22%3A00%3A00.000Z
```
## Introducing Custom Filters
Custom filters allow you to refactor your filtering logic into a reusable and declarative component. Here's how you can define a custom filter for active orders:
```ts
class Order {
  //...
  static activeOrders = Filter.createCustom<Order, { year: number }>(
    async ({ year }) => {
      return {
        status: ['created', 'confirmed', 'pending', 'blocked', 'delayed'],
        createdAt: {
          $gte: new Date(year, 0, 1),
          $lt: new Date(year + 1, 0, 1),
        },
      }
    },
  )
}
```
- **First Generic Parameter (`Order`):** This parameter specifies the entity class that the filter is associated with. In this case, it's the `Order` class. This is important because it ensures that the filter criteria you define are compatible with the fields and types of the `Order` entity.
- **Second Generic Parameter (`{ year: number }`):** This parameter defines the type of the argument that the filter will receive when executed. In this example, the filter expects an object with a single property `year` of type `number`. This allows you to pass dynamic values to the filter when you use it in a query, making the filter more flexible and reusable.
- **Callback Function (`async ({ year }) => { ... }`):** This function is where you define the actual filtering criteria. It receives an argument matching the type specified in the second generic parameter. Inside the function, you return an object representing the filter conditions. In this case, the conditions are based on the `status` and `createdAt` fields of the `Order` entity.
Now, you can use this custom filter in your queries:
```ts
await repo(Order).find({
where: Order.activeOrders({ year }),
})
```
This generates a much simpler REST API call:
```
/api/orders?%24custom%24activeOrders=%7B%22year%22%3A2024%7D
```
## Composability of Custom Filters
One of the key advantages of custom filters is their ability to be composed with other filters. This means you can combine custom filters with regular filters or even other custom filters to build complex filtering logic.
Let's take a closer look at the following example:
```ts
await repo(Order).find({
  where: {
    customerId: '123',
    $and: [Order.activeOrders({ year })],
  },
})
```
In this query, we're filtering orders based on two criteria:
1. The `customerId` should be "123".
2. The order should satisfy the conditions defined in the `activeOrders` custom filter for the specified year.
By using the `$and` operator, we're able to combine the custom filter with a regular filter. This demonstrates the composability of custom filters, allowing you to build more complex and nuanced filtering logic while maintaining readability and reusability.
### More on Composability
The power of composability doesn't stop there. You can also combine multiple custom filters to create even more specific filters. For example, suppose you have another custom filter called `highValueOrders` that filters orders based on their total value:
```ts
class Order {
  //...
  static highValueOrders = Filter.createCustom<Order>(() => {
    return {
      totalValue: { $gt: 1000 },
    }
  })
}
```
You can then combine this with the `activeOrders` filter to find high-value active orders for a specific year:
```ts
await repo(Order).find({
  where: {
    $and: [Order.activeOrders({ year }), Order.highValueOrders()],
  },
})
```
This ability to compose filters allows you to create modular and reusable filtering logic, which can significantly improve the maintainability and clarity of your code.
### Evaluating Custom Filters on the Backend
One of the significant advantages of custom filters is that they are evaluated on the backend. This allows you to perform complex data-related operations that would be inefficient or impossible to do solely on the frontend. For instance, you can leverage database queries or other server-side logic to build your filtering criteria.
Let's examine the following example:
```ts
static activeOrders = Filter.createCustom<
  Order,
  { year: number; customerCity: string }
>(async ({ year, customerCity }) => {
  const customers = await repo(Customer).find({
    where: { city: customerCity },
  })
  return {
    customerId: { $in: customers.map((c) => c.id) },
    status: ["created", "confirmed", "pending", "blocked", "delayed"],
    createdAt: {
      $gte: new Date(year, 0, 1),
      $lt: new Date(year + 1, 0, 1),
    },
  }
})
```
In this example, the custom filter `activeOrders` now takes an additional parameter `customerCity`. The filter performs a database query to fetch all customers from the specified city. It then uses the IDs of these customers to filter orders that belong to them. This is combined with the existing criteria of filtering orders based on their status and creation year.
::: tip Key Points
- **Backend Evaluation:** The filter is evaluated on the backend, where it has access to the database and can perform efficient queries. This offloads complex data processing from the frontend to the backend, where it can be handled more effectively.
- **Complex Filtering:** By leveraging backend capabilities, you can create filters that involve complex operations, such as fetching related data from other tables or entities (in this case, fetching customers based on their city).
- **Asynchronous Operations:** Notice the use of `async` in the filter definition. This allows you to perform asynchronous operations, such as database queries, within your custom filter.
:::
## Leveraging Database Capabilities with Raw SQL in Custom Filters
Since custom filters are **evaluated on the backend**, you have the opportunity to harness the raw capabilities of the underlying database. This can be particularly useful when you need to perform complex operations that are more efficiently handled by the database itself. For instance, you can use raw SQL queries to improve the performance or functionality of your custom filters.
Let's modify the `activeOrders` custom filter to use a raw SQL query for filtering orders based on the customer's city:
```ts
static activeOrders = Filter.createCustom<
  Order,
  { year: number; customerCity: string }
>(async ({ year, customerCity }) => {
  return {
    status: ["created", "confirmed", "pending", "blocked", "delayed"],
    createdAt: {
      $gte: new Date(year, 0, 1),
      $lt: new Date(year + 1, 0, 1),
    },
    $and: [
      SqlDatabase.rawFilter(({ param }) => // [!code highlight]
        `"customerId" in (select id from customers where city = ${param(customerCity)})` // [!code highlight]
      ), // [!code highlight]
    ],
  }
})
```
In this example, we've added a `$and` condition that uses `SqlDatabase.rawFilter` to include a raw SQL fragment in our filter. This SQL fragment selects the IDs of customers from the specified city and uses them to filter the orders.
This generates the following SQL:
```sql
select "id", "status", "customerId", "createdAt"
from "orders"
where "status" in ($2, $3, $4, $5, $6)
and "createdAt" >= $7
and "createdAt" < $8
and ("customerId" in (select id from customers where city = $9))
Order By "id"
```
#### Important Notes
- **Parameterized Queries:** It's crucial to use parameterized queries (e.g., `param(customerCity)`) when incorporating user-supplied values into your SQL queries. This helps prevent SQL injection attacks by ensuring that user input is properly escaped.
- **Performance Considerations:** Leveraging raw SQL can lead to significant performance improvements, especially for complex queries. However, it's important to ensure that your SQL queries are well-optimized to avoid potential performance issues.
#### Usage Example
Using the custom filter remains straightforward:
```ts
await repo(Order).find({
where: Order.activeOrders({ year: 2024, customerCity: 'New York' }),
})
```
### Using `dbNamesOf` with Table Names and Aliases
The `dbNamesOf` utility function can be customized to include the table name in the SQL queries. This is particularly useful for ensuring consistency between your entity definitions and your raw SQL queries.
Here's an updated example of the `activeOrders` custom filter using `dbNamesOf` with table names and aliases:
```ts
static activeOrders = Filter.createCustom<
  Order,
  { year: number; customerCity: string }
>(async ({ year, customerCity }) => {
  const order = await dbNamesOf(Order, { // [!code highlight]
    tableName: true, // [!code highlight]
  })
  const customer = await dbNamesOf(Customer, { // [!code highlight]
    tableName: "c", // [!code highlight]
  }) // [!code highlight]
  return {
    status: ["created", "confirmed", "pending", "blocked", "delayed"],
    createdAt: {
      $gte: new Date(year, 0, 1),
      $lt: new Date(year + 1, 0, 1),
    },
    $and: [
      SqlDatabase.rawFilter(({ param }) => // [!code highlight]
        `${order.customerId} in (select ${customer.id} from ${customer} as c // [!code highlight]
         where ${customer.city} = ${param(customerCity)})` // [!code highlight]
      ),
    ],
  }
})
```
In this example:
- The `Order` table is referenced with its full name.
- The `Customer` table is aliased as `"c"`, and this alias is used in the SQL query.
### Explanation of `tableName` and Aliases
- **`tableName: true`:** By setting `tableName: true`, you indicate that you want to include the table name when referring to fields, resulting in SQL expressions like `"customer"."id"`.
- **Aliases:** You can use aliases for table names, which is particularly useful in complex join scenarios. For example, setting `tableName: "c"` would use the alias `"c"` for the table name in the SQL query.
### Resulting SQL Query
Using the `activeOrders` custom filter with the enhancements mentioned above would generate the following SQL query:
```sql
select "id", "status", "customerId", "createdAt"
from "orders"
where "status" in ($2, $3, $4, $5, $6)
and "createdAt" >= $7
and "createdAt" < $8
and ("orders"."customerId" in (select "c"."id" from "customers" as c
where c."city" = $9))
Order By "id"
```
In this SQL query, the `Customer` table is aliased as `"c"`, and this alias is used throughout the query to ensure consistency with the entity definitions and to handle complex join scenarios effectively.
### SQL-Based Custom Filters: Unleashing the Power of Composability
The greatest advantage of using SQL-based custom filters lies in their composability and the ability to handle complex situations. By breaking down filtering logic into smaller, atomic custom filters, developers can compose these filters to create more sophisticated and nuanced filtering criteria. This modular approach not only enhances the readability and maintainability of the code but also allows for greater flexibility in constructing complex queries.
For instance, consider a scenario where you need to filter orders based on multiple criteria, such as status, creation year, customer location, and order value. By creating separate custom filters for each of these criteria, you can easily combine them to form a comprehensive filtering solution. This composability ensures that your filtering logic can adapt to various requirements without becoming convoluted or difficult to manage.
Furthermore, the ability to handle complex situations is a significant advantage of SQL-based custom filters. By leveraging the raw power of SQL, you can perform advanced operations such as subqueries, joins, and aggregate functions directly within your filters. This opens up a wide range of possibilities for data analysis and manipulation, enabling you to tackle complex filtering scenarios with ease.
SQL is a language that is widely recognized and understood by AI technologies such as ChatGPT, Copilot and others. This makes it possible to generate highly optimized queries with ease. These AI technologies can assist in writing SQL queries, ensuring they are efficient and effective. This is particularly beneficial when dealing with complex data structures and large datasets, where writing optimal queries can be challenging. With the assistance of AI, developers can focus more on the logic of their applications, while the AI handles the intricacies of SQL query optimization.
In summary, the composability of SQL-based custom filters, coupled with their ability to handle complex situations, makes them an invaluable tool for developers seeking to create flexible, efficient, and powerful data filtering solutions in their web applications.
### Using Raw Filters with Different Data Providers
Custom filters with raw filters are not limited to SQL databases. You can also use raw filters with other data providers, such as Knex or an in-memory JSON data provider. This flexibility allows you to leverage the power of raw filters in various contexts, depending on your application's needs.
#### Knex Example
Knex is a popular SQL query builder for Node.js. You can use Knex with custom filters to define complex filtering logic directly using the Knex query builder syntax.
```typescript
static idBetween = Filter.createCustom<Task, { from: number; to: number }>(
  ({ from, to }) => {
    return KnexDataProvider.rawFilter(({ knexQueryBuilder }) => {
      knexQueryBuilder.andWhereBetween('id', [from, to]);
    });
  }
);
```
In this example, the `idBetween` custom filter uses Knex to filter `Task` entities whose `id` falls between the specified `from` and `to` values.
#### JSON Example
For applications that use an in-memory JSON data provider, you can define custom filters that operate directly on the JSON data.
```typescript
static titleLengthFilter = Filter.createCustom<Task, { minLength: number }>(
  ({ minLength }) => {
    return ArrayEntityDataProvider.rawFilter((item) => {
      return item.title?.length > minLength;
    });
  }
);
```
In this example, the `titleLengthFilter` custom filter filters `Task` entities based on the length of their `title` property, ensuring that it exceeds the specified `minLength`.
## Conclusion
Custom filters represent a powerful tool in the arsenal of web developers, offering a flexible and efficient way to handle data filtering in web applications. By encapsulating filtering logic into reusable components, custom filters not only enhance code readability and maintainability but also enable the execution of complex filtering operations on the backend. This leads to improved performance and security, as well as the ability to compose intricate filtering criteria with ease.
The versatility of custom filters extends to their compatibility with various data providers, from SQL databases to in-memory JSON arrays, allowing developers to leverage the most suitable data handling mechanisms for their specific use cases. Moreover, the declarative nature of custom filters ensures that the filtering logic remains clear and concise, facilitating easier debugging and future modifications.
In conclusion, adopting custom filters in your web development projects can significantly streamline the process of data filtering, resulting in cleaner, more efficient, and more secure code. By embracing this approach, developers can focus on delivering a seamless user experience, confident in the knowledge that their data filtering logic is both robust and adaptable.
# Escape Hatches - Direct Database Access
# Accessing the Underlying Database in Remult
While Remult provides a powerful abstraction for working with databases, there might be scenarios where you need to access the underlying database directly. This could be for performing complex queries, optimizations, or other database-specific operations that are not covered by Remult's API.
:::warning
Directly executing custom SQL can be dangerous and prone to SQL injection attacks. Always use parameterized queries and the `param` method provided by Remult to safely include user input in your queries.
:::
## Accessing SQL Databases
For SQL-based databases, Remult provides the SqlDatabase class to interact directly with the database and allows you to run raw SQL queries directly. This is useful for executing complex queries that involve operations like GROUP BY, bulk updates, and other advanced SQL features.
### Basic SQL Query
```typescript
const sql = SqlDatabase.getDb()
const result = await sql.execute('SELECT COUNT(*) AS count FROM tasks')
console.log(result.rows[0].count)
```
This approach is straightforward but can lead to inconsistencies if the database schema changes.
#### The `dbNamesOf` function
The `dbNamesOf` function dynamically retrieves the database table and column names based on your entity definitions, ensuring that your queries stay in sync with your data model. This enhances consistency, maintainability, and searchability in your code.
```typescript
const tasks = await dbNamesOf(Task)
const sql = SqlDatabase.getDb()
const result = await sql.execute(`SELECT COUNT(*) AS count FROM ${tasks}`)
console.log(result.rows[0].count)
```
##### Create index example
```typescript
const tasks = await dbNamesOf(Task)
const sql = SqlDatabase.getDb()
await sql.execute(`CREATE INDEX idx_task_title ON ${tasks}(${tasks.title});`)
```
### Using Bound Parameters
The `param` method safely incorporates user input into the query, reducing the risk of SQL injection by using parameterized queries.
```typescript
const priceToUpdate = 5
const products = await dbNamesOf(Product)
const sql = SqlDatabase.getDb()
let command = sql.createCommand()
await command.execute(
  `UPDATE ${products} SET ${products.price} = ${products.price} + ${command.param(priceToUpdate)}`,
)
```
When executed, this code will run the following SQL:
```sql
UPDATE products SET price = price + $1
Arguments: { '$1': 5 }
```
### Leveraging EntityFilter for SQL Databases
The `filterToRaw` function converts Remult's `EntityFilter` objects into SQL where clauses, enabling you to incorporate complex filtering logic defined in your models into custom SQL queries. This allows for reusability and integration with backend filters.
#### Benefits of filterToRaw
- **Reusability**: Allows you to reuse complex filters defined in your Remult models in custom SQL queries.
- **Integration**: Respects any **backendPrefilter** and **backendPreprocessFilter** applied to your entities, ensuring consistent access control and data manipulation rules.
```typescript
const order = await dbNamesOf(Order)
const sql = SqlDatabase.getDb()
const command = sql.createCommand()
const filterSql = await SqlDatabase.filterToRaw(
  Order,
  {
    status: ['created', 'confirmed', 'pending', 'blocked', 'delayed'],
    createdAt: {
      $gte: new Date(year, 0, 1),
      $lt: new Date(year + 1, 0, 1),
    },
  },
  command,
)
const result = await command.execute(
  `SELECT COUNT(*) FROM ${order} WHERE ${filterSql}`,
)
console.log(result.rows[0].count)
```
Resulting SQL:
```sql
SELECT COUNT(*) FROM "orders"
WHERE "status" IN ($1, $2, $3, $4, $5) AND "createdAt" >= $6 AND "createdAt" < $7
```
Using a custom filter:
```typescript
const order = await dbNamesOf(Order)
const sql = SqlDatabase.getDb()
const command = sql.createCommand()
const filterSql = await SqlDatabase.filterToRaw(
  Order,
  Order.activeOrders({ year, customerCity: 'London' }),
  command,
)
const result = await command.execute(
  `SELECT COUNT(*) FROM ${order} WHERE ${filterSql}`,
)
console.log(result.rows[0].count)
```
Resulting SQL:
```sql
SELECT COUNT(*) FROM "orders"
WHERE "status" IN ($1, $2, $3, $4, $5) AND "createdAt" >= $6 AND "createdAt" < $7 AND ("orders"."customerId" IN (
SELECT "customers"."id" FROM "customers"
WHERE "customers"."city" = $8
))
```
## Accessing Other Databases
## Knex
```typescript
const tasks = await dbNamesOf(Task)
const knex = KnexDataProvider.getDb()
const result = await knex(tasks.$entityName).count()
console.log(result[0].count)
```
### Leveraging EntityFilter for Knex
```ts
const tasks = await dbNamesOf(Task)
const knex = KnexDataProvider.getDb()
const r = await knex(tasks.$entityName)
  .count()
  .where(await KnexDataProvider.filterToRaw(Task, { id: [1, 3] }))
console.log(r[0].count)
```
## MongoDB
```ts
const tasks = await dbNamesOf(Task)
const mongo = MongoDataProvider.getDb()
const r = await (await mongo.collection(tasks.$entityName)).countDocuments()
console.log(r)
```
### Leveraging EntityFilter for MongoDb
```ts
const tasks = await dbNamesOf(Task)
const mongo = MongoDataProvider.getDb()
const r = await (await mongo.collection(tasks.$entityName)).countDocuments(
  await MongoDataProvider.filterToRaw(Task, { id: [1, 2] }),
)
console.log(r)
```
## Native Postgres
```ts
const tasks = await dbNamesOf(Task)
const sql = PostgresDataProvider.getDb()
const r = await sql.query(`select count(*) as c from ${tasks}`)
console.log(r.rows[0].c)
```
## Conclusion
Accessing the underlying database directly in Remult provides the flexibility to handle complex use cases that might not be covered by the ORM layer. However, it's important to use this capability judiciously and securely, especially when dealing with user input, to avoid potential security vulnerabilities like SQL injection. By leveraging utilities like `dbNamesOf` and `filterToRaw`, you can keep these custom queries consistent with your entity definitions and with the access control rules defined in your models.
# Escape Hatches - Using Remult in Non-Remult Routes
---
keywords:
[
Error: remult object was requested outside of a valid context,
try running it within initApi or a remult request cycle,
]
---
# Using Remult in Non-Remult Routes
When using the CRUD api or [BackendMethods](./backendMethods.md), `remult` is automatically available. Still, there are many use cases where you may want to use remult in your own routes or other code without using `BackendMethods`, while still taking advantage of `Remult` as an ORM and using it to check for user validity, etc.
If you tried to use the `remult` object, you may have got the error:
## Error: remult object was requested outside of a valid context, try running it within initApi or a remult request cycle
Here's how you can use remult in this context, according to the server you're using:
::: tabs
== Express
### withRemult middleware
You can use remult as an express middleware for a specific route, using `api.withRemult`
```ts{1}
app.post('/api/customSetAll', api.withRemult, async (req, res) => {
// ....
})
```
Or as an express middleware for multiple routes
```ts
app.use(api.withRemult) // [!code highlight]
app.post('/api/customSetAll', async (req, res) => {
// ....
})
```
### withRemultAsync promise wrapper
Use the `api.withRemultAsync` method in promises
```ts
import express from 'express'
import { remultExpress } from 'remult/remult-express'
const app = express();
...
const api = remultExpress({
entities:[Task]
})
app.post('/api/customSetAll', async (req, res) => {
  // use remult in a specific piece of code // [!code highlight]
  await api.withRemultAsync(req, async () => { // [!code highlight]
    if (!remult.authenticated()) {
      res.sendStatus(403);
      return;
    }
    if (!remult.isAllowed("admin")) {
      res.sendStatus(403);
      return;
    }
    const taskRepo = remult.repo(Task);
    for (const task of await taskRepo.find()) {
      task.completed = req.body.completed;
      await taskRepo.save(task);
    }
    res.send();
  })
});
```
You can also use it without sending the request object, for non request related code
```ts{2}
setInterval(async () => {
  api.withRemultAsync(undefined, async () => {
    // ....
  })
}, 10000)
```
== Fastify
```ts
import fastify from 'fastify'
import { remultFastify } from 'remult/remult-fastify'
(async () => {
  const server = fastify()
  const api = remultFastify({}) // [!code highlight]
  await server.register(api) // [!code highlight]
  server.get('/api/test', async (req, res) => {
    return {
      result: await api.withRemult(req, () => remult.repo(Task).count()), // [!code highlight]
    }
  })
  server.listen({ port: 3000 })
})()
```
== Hono
```ts
import { Hono } from 'hono'
import { remultHono } from 'remult/remult-hono'
const app = new Hono()
const api = remultHono({})
app.get('/test1', api.withRemult, async (c) => // [!code highlight]
c.text('hello ' + (await repo(Task).count())),
)
app.route('', api)
export default app
```
== Next.js app router
```ts
// src/app/api/test/route.ts
import { NextResponse } from 'next/server'
import { remult, repo } from 'remult'
import { Task } from '../../../shared/task'
import { api } from '../../../api'

export const dynamic = 'force-dynamic'

export async function GET(req: Request) {
  return api.withRemult(async () => {
    return NextResponse.json({
      result: await repo(Task).count(),
      user: remult.user,
    })
  })
}
```
== Sveltekit
You can use the `withRemult` method in specific routes
```ts
// src/routes/api/test/+server.ts
import { json, type RequestHandler } from '@sveltejs/kit'
import { remult } from 'remult'
import { Task } from '../../../shared/Task'
import { api } from '../../../server/api'
export const GET: RequestHandler = async (event) => {
  return api.withRemult(event, async () =>
    json({ result: await remult.repo(Task).count() }),
  )
}
```
You can also define the withRemult as a hook, to make remult available throughout the application
```ts
// src/hooks.server.ts
import type { Handle } from '@sveltejs/kit'
import { sequence } from '@sveltejs/kit/hooks'
import { api as handleRemult } from './server/api'
export const handle = sequence(
// Handle remult server side
handleRemult,
)
```
== SolidStart
You can use the `withRemult` method in specific routes
```ts
// src/routes/api/test.ts
import type { APIEvent } from '@solidjs/start/server'
import { remult } from 'remult'
import { Task } from '../../../shared/Task'
import { api } from '../../../server/api'

export function GET(event: APIEvent) {
  return api.withRemult(event, async () => ({
    result: await remult.repo(Task).count(),
  }))
}
```
You can also use the same method for any "use server" function
```ts
export function getCount(){
return api.withRemult(event, async () =>
({ result: await remult.repo(Task).count() }),
)
}
```
== Hapi
```ts
import { server } from '@hapi/hapi'
import { remultHapi } from 'remult/remult-hapi'

(async () => {
  const hapi = server({ port: 3000 })
  const api = remultHapi({})
  await hapi.register(api) // [!code highlight]
  hapi.route({
    method: 'GET',
    path: '/api/test2',
    handler: async (request, h) => {
      return api.withRemult(request, async () => {
        return {
          result: await remult.repo(Task).count(),
        }
      })
    },
  })
  hapi.start()
})()
```
:::
# Escape Hatches - Avoiding Decorators
# Working without decorators
If you prefer to work without decorators, or use `remult` in a JavaScript project (without TypeScript), you can use the following:
## Entity
::: code-group
```ts [Typescript]
import { Entity, Fields, describeEntity } from 'remult'
export class Task {
id!: string
title = ''
completed = false
}
describeEntity(
Task,
'tasks',
{
allowApiCrud: true,
},
{
id: Fields.uuid(),
title: Fields.string(),
completed: Fields.boolean(),
},
)
```
```js [Javascript]
import { Entity, Fields, describeEntity } from 'remult'
export class Task {
id
title = ''
completed = false
}
describeEntity(
Task,
'tasks',
{
allowApiCrud: true,
},
{
id: Fields.uuid(),
title: Fields.string(),
completed: Fields.boolean(),
},
)
```
:::
This is the same entity that is detailed in the [Entities section of the tutorial](https://remult.dev/tutorials/react/entities.html)
## Static BackendMethod
```ts{12-14}
import { BackendMethod, describeBackendMethods, remult } from "remult";
import { Task } from "./Task";
export class TasksController {
static async setAll(completed: boolean) {
const taskRepo = remult.repo(Task);
for (const task of await taskRepo.find()) {
await taskRepo.save({ ...task, completed });
}
}
}
describeBackendMethods(TasksController, {
setAll: { allowed: "admin" }
})
```
This is the same backend method that is detailed in the [Backend methods of the tutorial](https://remult.dev/tutorials/react/backend-methods.html#refactor-from-front-end-to-back-end)
# Escape Hatches - Extensibility
---
tags:
- options
- bespoke options
- customizing options
- type augmentation
- module augmentation
- UserInfo
- RemultContext
- context
---
# Extensibility
[Module Augmentation](https://www.typescriptlang.org/docs/handbook/declaration-merging.html#module-augmentation) in TypeScript allows you to extend existing types with custom properties or methods. This enhances the functionality of third-party libraries like `remult` without altering their source code, enabling seamless integration of custom features while maintaining type safety.
In Remult, you can use TypeScript's module augmentation to enhance your application with custom features. Here are some examples:
1. **Add more fields to the User object:** Extend the `UserInfo` interface to include additional fields like `email` and `phone`.
2. **Add custom options/metadata to fields and entities:** Extend the `FieldOptions` or `EntityOptions` interfaces to include custom properties such as `placeholderText` or `helpText`.
3. **Add fields/methods to the `remult.context` object:** Extend the `RemultContext` interface to include additional properties or methods that can be accessed throughout your code.
## Setting Up the types.d.ts File for Custom Type Extensions
To set up the `types.d.ts` file for custom type extensions in Remult:
1. **Create a TypeScript Declaration File:** Add a file named `types.d.ts` in the `src` folder of your project. This file will be used to declare custom type extensions, such as additional user info fields.
```ts
// src/types.d.ts
export {}
declare module 'remult' {
interface UserInfo {
phone: string // [!code highlight]
email: string // [!code highlight]
}
}
```
The `export {}` is required to indicate that this file is a module, as per the [Vue.js documentation on augmenting global properties](https://vuejs.org/guide/typescript/options-api.html#augmenting-global-properties).
2. **Include the Declaration File in tsconfig:** Make sure that the `types.d.ts` file is included in the `include` section of your `tsconfig.json` file. If you have a separate `tsconfig` for the server, ensure that it's also added there.
```json
// tsconfig.server.json
{
"compilerOptions": {
//...
},
"include": ["src/server/**/*", "src/shared/**/*", "src/types.d.ts"] // [!code highlight]
}
```
3. **Utilize the Custom Fields in Your Code:** Once you've defined custom fields in the `types.d.ts` file and ensured they're included in your `tsconfig.json`, you can start using them throughout your application. For instance, if you've added `phone` and `email` to the `UserInfo` interface, you can access these properties in your code as follows:
```ts
// Accessing custom user info fields
console.log(remult.user.phone)
console.log(remult.user.email)
```
This enables you to seamlessly integrate the new fields into your application's logic and user interface.
## Enhancing Field and Entity Definitions with Custom Options
One of the key motivations for adding custom options to `FieldOptions` or `EntityOptions` is to maintain consistency and centralize the definition of entities and fields in your application. By keeping these definitions close to the entity or field, you ensure a single source of truth for your application's data model. This approach enhances maintainability and readability, as all relevant information and metadata about an entity or field are located in one place. Additionally, it allows for easier integration with UI components, as custom options like `placeholderText` can be directly accessed and used in your frontend code.
For adding custom options to `FieldOptions` or `EntityOptions`, such as `placeholderText`:
1. **Extend FieldOptions:** In your `types.d.ts` file, extend the `FieldOptions` interface to include your custom options. For example:
```ts
declare module 'remult' {
interface FieldOptions {
placeholderText?: string // [!code highlight]
}
}
export {}
```
2. **Set Custom Option:** Specify the `placeholderText` in your entity field options:
```ts
import { Entity, Fields } from 'remult'
@Entity('tasks', { allowApiCrud: true })
export class Task {
@Fields.uuid()
id!: string
@Fields.string({
placeholderText: 'Please enter a task title', // [!code highlight]
})
title = ''
@Fields.boolean()
completed = false
}
```
3. **Use in UI:** Access the custom option in your UI components, for example as an input placeholder:
```html{2}
<input
  placeholder={taskRepo.fields.title.options.placeholderText}
/>
```
By following these steps, you can extend `FieldOptions` with custom options that can be utilized throughout your project.
### Extending Remult's `context` Property for Request-Specific Information
Augmenting Remult's `context` property is particularly useful because it allows you to store and access request-specific information throughout your code. This can be especially handy for including data from the request and utilizing it in entities or backend methods.
For example, you can add a custom property `origin` to the `RemultContext` interface:
```ts
declare module 'remult' {
export interface RemultContext {
origin?: string // [!code highlight]
}
}
```
Then, set the `origin` property in the `initRequest` option in the `api.ts` file:
```ts
export const api = remultExpress({
initRequest: async (_, req) => {
remult.context.origin = req.headers.origin // [!code highlight]
},
entities: [Task],
//...
})
```
You can now use the `origin` property anywhere in your code, for example:
```ts
@BackendMethod({ allowed: Roles.admin })
static async doSomethingImportant() {
console.log(remult.context.origin); // [!code highlight]
}
```
or in an entity's saving event:
```ts
@Entity("tasks", {
saving: task => {
task.lastUpdateDate = new Date();
task.lastUpdateUser = remult.user?.name;
task.lastUpdateOrigin = remult.context.origin; // [!code highlight]
},
//...
});
```
By leveraging module augmentation, you can tailor Remult to your specific needs, adding custom options and extending interfaces to suit your application's requirements.
# Integrations - Open API
# Adding Swagger and openApi
In short, Swagger provides a quick UI that describes the API exposed by the application.
To add swagger to a `remult` application follow these steps:
1. Install the `swagger-ui-express` package:
```sh
npm i swagger-ui-express
npm i --save-dev @types/swagger-ui-express
```
2. In the `/src/server/index.ts` file add the following code:
```ts{2,6-9}
import express from 'express';
import swaggerUi from 'swagger-ui-express';
import { remultExpress } from 'remult/remult-express';
const app = express();
let api = remultExpress();
app.use(api);
const openApiDocument = api.openApiDoc({ title: "remult-react-todo" });
app.get("/api/openApi.json", (req, res) => {res.json(openApiDocument)});
app.use('/api/docs', swaggerUi.serve, swaggerUi.setup(openApiDocument));
app.listen(3002, () => console.log("Server started"));
```
## Adding Swagger UI to a NextJs App
To add swagger UI to a `NextJs` application follow these steps:
1. Install the following packages:
```sh
# With npm
npm i swagger-ui-react
npm i -D @types/swagger-ui-react
# With yarn
yarn add swagger-ui-react
yarn add -D @types/swagger-ui-react
```
2. Get the openApi document from RemultNextAppServer:
```ts
// src/api.ts
import { Task } from '@/shared/Task'
import { TasksController } from '@/shared/TasksController'
import { remultNextApp } from 'remult/remult-next'
export const api = remultNextApp({
admin: true,
entities: [Task],
controllers: [TasksController],
})
// Export this here 👇
export const openApiDoc = api.openApiDoc({
title: 'Todo App',
})
export const { POST, PUT, DELETE, GET } = api
```
3. Create a new page to render Swagger UI:
```tsx
// src/app/api-doc/page.tsx
import { openApiDoc } from '@/api' // 👈 Import the openApiDoc you exported earlier
import ReactSwagger from './react-swagger'
export default async function IndexPage() {
  return <ReactSwagger spec={openApiDoc} />
}
```
```tsx
// src/app/api-doc/react-swagger.tsx
'use client'
import SwaggerUI from 'swagger-ui-react'
import 'swagger-ui-react/swagger-ui.css'
type Props = {
  spec: Record<string, any>
}

function ReactSwagger({ spec }: Props) {
  return <SwaggerUI spec={spec} />
}
export default ReactSwagger
```
4. Navigate to `http://localhost:3000/api-doc` to see the Swagger UI.
![Remult Admin](../public/example_remult-next-swagger-ui-page.png)
# Adding OpenAPI-specific field options
Check out the following example project that demonstrates how to add `openApi`-specific options to field options:
[stackblitz](https://stackblitz.com/github/noam-honig/adding-open-api-options?file=server/build-open-api.ts,shared/task.ts)
[github](https://www.github.com/noam-honig/adding-open-api-options)
# Integrations - GraphQL
# Adding GraphQL
To add GraphQL to a `remult` application, follow these steps:
1. Install the `graphql-yoga` packages:
```sh
npm i graphql-yoga
```
## Express:
In the `/src/server/index.ts` file add the following code:
```ts{3-4,12-22}
import express from 'express';
import { remultExpress } from 'remult/remult-express';
import { createSchema, createYoga } from 'graphql-yoga'
import { remultGraphql } from 'remult/graphql';
const app = express()
const entities = [Task]
let api = remultExpress({
  entities
});
app.use(api);
const { typeDefs, resolvers } = remultGraphql({
  entities
});
const yoga = createYoga({
  graphqlEndpoint: '/api/graphql',
  schema: (createSchema({
    typeDefs,
    resolvers
  }))
})
app.use(yoga.graphqlEndpoint, api.withRemult, yoga)
app.listen(3002, () => console.log("Server started"));
```
## Next App Router
```ts
// Next.js Custom Route Handler: https://nextjs.org/docs/app/building-your-application/routing/router-handlers
import { createYoga, createSchema } from 'graphql-yoga'
import { remultGraphql } from 'remult/graphql'
import { api } from '../../../api'
import { Task } from '../../../shared/task'
const { typeDefs, resolvers } = remultGraphql({
  entities: [Task],
})

const yoga = createYoga({
  // While using Next.js file convention for routing, we need to configure Yoga to use the correct endpoint
  graphqlEndpoint: '/api/graphql',
  schema: createSchema({
    typeDefs,
    resolvers,
  }),
  // Yoga needs to know how to create a valid Next response
  fetchAPI: { Response },
})

const handleRequest = (request: any, ctx: any) =>
  api.withRemult(() => yoga.handleRequest(request, ctx))
export { handleRequest as GET, handleRequest as POST }
```
## Svelte
`src/routes/api/graphql/+server.ts`
```ts
import type { RequestEvent } from '@sveltejs/kit'
import { createSchema, createYoga } from 'graphql-yoga'
import { remultGraphql } from 'remult/graphql'
import { Task } from '../../../shared/Task'
const { typeDefs, resolvers } = remultGraphql({
  entities: [Task],
})

const yogaApp = createYoga({
  schema: createSchema({
    typeDefs,
    resolvers,
  }),
  // While using SvelteKit's file convention for routing, we need to configure Yoga to use the correct endpoint
  graphqlEndpoint: '/api/graphql',
  fetchAPI: { Response },
})
export { yogaApp as GET, yogaApp as OPTIONS, yogaApp as POST }
```
# API Reference - Entity
# Entity
Decorates classes that should be used as entities.
Receives a key and an array of EntityOptions.
#### example:
```ts
import { Entity, Fields } from "remult";
@Entity("tasks", {
allowApiCrud: true
})
export class Task {
@Fields.uuid()
id!: string;
@Fields.string()
title = '';
@Fields.boolean()
completed = false;
}
```
#### note:
EntityOptions can be set in two ways:
#### example:
```ts
// as an object
@Entity("tasks",{ allowApiCrud:true })
```
#### example:
```ts
// as an arrow function that receives `remult` as a parameter
@Entity("tasks", (options,remult) => options.allowApiCrud = true)
```
## caption
A human readable name for the entity
## allowApiRead
Determines if this entity is available for GET requests using the REST API.
#### description:
Determines if one has any access to the data of an entity.
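For example, you might make an entity readable only by authenticated users (a sketch; the entity key is illustrative):
```ts
@Entity("tasks", { allowApiRead: Allow.authenticated })
```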
#### see:
- [allowed](http://remult.dev/docs/allowed.html)
- to restrict data based on a criteria, use [apiPrefilter](https://remult.dev/docs/ref_entity.html#apiprefilter)
## allowApiUpdate
Determines if this entity can be updated through the api.
#### see:
- [allowed](http://remult.dev/docs/allowed.html)
- [Access Control](https://remult.dev/docs/access-control)
## allowApiDelete
Determines if entries for this entity can be deleted through the api.
#### see:
- [allowed](http://remult.dev/docs/allowed.html)
- [Access Control](https://remult.dev/docs/access-control)
## allowApiInsert
Determines if new entries for this entity can be posted through the api.
#### see:
- [allowed](http://remult.dev/docs/allowed.html)
- [Access Control](https://remult.dev/docs/access-control)
## allowApiCrud
Sets the `allowApiUpdate`, `allowApiDelete` and `allowApiInsert` properties in a single set.
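For example, a sketch that opens CRUD access to authenticated users only (the entity key is illustrative):
```ts
@Entity("tasks", { allowApiCrud: Allow.authenticated })
```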
## apiPrefilter
An optional filter that determines which rows can be queried using the API.
This filter is applied to all CRUD operations to ensure that only authorized data is accessible.
Use `apiPrefilter` to restrict data based on user profile or other conditions.
#### example:
```ts
// Only include non-archived items in API responses
apiPrefilter: { archive: false }
```
#### example:
```ts
// Allow admins to access all rows, but restrict non-admins to non-archived items
apiPrefilter: () => remult.isAllowed("admin") ? {} : { archive: false }
```
#### see:
[EntityFilter](https://remult.dev/docs/access-control.html#filtering-accessible-rows)
## apiPreprocessFilter
An optional function that allows for preprocessing or modifying the EntityFilter for a specific entity type
before it is used in API CRUD operations. This function can be used to enforce additional access control
rules or adjust the filter based on the current context or specific request.
#### example:
```typescript
@Entity("tasks", {
apiPreprocessFilter: async (filter, { getPreciseValues }) => {
// Ensure that users can only query tasks for specific customers
const preciseValues = await getPreciseValues();
if (!preciseValues.customerId) {
throw new ForbiddenError("You must specify a valid customerId filter");
}
return filter;
}
})
```
## backendPreprocessFilter
Similar to apiPreprocessFilter, but for backend operations.
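A minimal sketch, mirroring the `apiPreprocessFilter` example above but applied to backend queries as well:
```ts
@Entity("tasks", {
  backendPreprocessFilter: async (filter, { getPreciseValues }) => {
    // Ensure that backend queries also specify a customerId
    const preciseValues = await getPreciseValues();
    if (!preciseValues.customerId) {
      throw new ForbiddenError("You must specify a valid customerId filter");
    }
    return filter;
  }
})
```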
## backendPrefilter
A filter that will be used for all queries from this entity both from the API and from within the backend.
#### example:
```ts
backendPrefilter: { archive:false }
```
#### see:
[EntityFilter](http://remult.dev/docs/entityFilter.html)
## defaultOrderBy
An order by to be used, in case no order by was specified
#### example:
```ts
defaultOrderBy: { name: "asc" }
```
#### example:
```ts
defaultOrderBy: { price: "desc", name: "asc" }
```
## saving
An event that will be fired before the Entity will be saved to the database.
If the `error` property of the entity's ref or any of its fields is set, the save will be aborted and an exception will be thrown.
This is the place to run logic that should always run before an entity is saved.
#### example:
```ts
@Entity("tasks", {
saving: async (task, e) => {
if (e.isNew) {
task.createdAt = new Date(); // Set the creation date for new tasks.
}
task.lastUpdated = new Date(); // Update the last updated date.
},
})
```
#### link:
LifeCycleEvent object
#### see:
[Entity Lifecycle Hooks](http://remult.dev/docs/lifecycle-hooks)
## saved
A hook that runs after an entity has been successfully saved.
#### link:
LifeCycleEvent object
#### see:
[Entity Lifecycle Hooks](http://remult.dev/docs/lifecycle-hooks)
## deleting
A hook that runs before an entity is deleted.
#### link:
LifeCycleEvent object
#### see:
[Entity Lifecycle Hooks](http://remult.dev/docs/lifecycle-hooks)
## deleted
A hook that runs after an entity has been successfully deleted.
#### link:
LifeCycleEvent object
#### see:
[Entity Lifecycle Hooks](http://remult.dev/docs/lifecycle-hooks)
## validation
A hook that runs to perform validation checks on an entity before saving.
This hook is also executed on the frontend.
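For example, a minimal sketch of an entity-level validation hook (the entity and rule are illustrative):
```ts
@Entity<Task>("tasks", {
  validation: async (task) => {
    if (task.completed && !task.title) throw "A completed task must have a title";
  },
})
```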
#### link:
LifeCycleEvent object
#### see:
[Entity Lifecycle Hooks](http://remult.dev/docs/lifecycle-hooks)
## dbName
The name of the table in the database that holds the data for this entity.
If no name is set, the `key` will be used instead.
#### example:
```ts
dbName:'myProducts'
```
You can also add your schema name to the table name:
#### example:
```ts
dbName:'public."myProducts"'
```
## sqlExpression
For entities that are based on SQL expressions instead of a physical table or view
#### example:
```ts
@Entity('people', {
  sqlExpression: `select id,name from employees
                  union all select id,name from contractors`,
})
export class Person {
  @Fields.string()
  id = ''
  @Fields.string()
  name = ''
}
```
## id
An arrow function that identifies the `id` column to use for this entity
#### example:
```ts
//Single column id
@Entity("products", { id: 'productCode' })
```
#### example:
```ts
//Multiple columns id
@Entity("orderDetails", { id:['orderId:', 'productCode'] })
```
## entityRefInit
Arguments:
* **ref**
* **row**
## apiRequireId
* **apiRequireId**
# API Reference - Field
# Field
Decorates class properties that should be used as entity fields.
For more info see: [Field Types](https://remult.dev/docs/field-types.html)
FieldOptions can be set in two ways:
#### example:
```ts
// as an object
@Fields.string({ includeInApi:false })
title='';
```
#### example:
```ts
// as an arrow function that receives `remult` as a parameter
@Fields.string((options,remult) => options.includeInApi = true)
title='';
```
## valueType
The value type for this field
## caption
A human readable name for the field. Can be used to achieve a consistent caption for a field throughout the app
#### example:
```ts
@Fields.string({ caption: 'The Title' })
title = '';
```
## allowNull
If it can store null in the database
## required
If a value is required
## includeInApi
Specifies whether this field should be included in the API. This can be configured
based on access control levels.
#### example:
```ts
// Do not include in the API
@Fields.string({ includeInApi: false })
password = '';
// Include in the API for 'admin' only
@Fields.number({ includeInApi: 'admin' })
salary = 0;
```
#### see:
- [allowed](https://remult.dev/docs/allowed.html)
- [Access Control](https://remult.dev/docs/access-control)
## allowApiUpdate
Determines whether this field can be updated via the API. This setting can also
be controlled based on user roles or other access control checks.
#### example:
```ts
// Prevent API from updating this field
@Fields.string({ allowApiUpdate: false })
createdBy = remult.user?.id;
```
#### see:
- [allowed](https://remult.dev/docs/allowed.html)
- [Access Control](https://remult.dev/docs/access-control)
## validate
An arrow function that'll be used to perform validations on it
#### example:
```ts
@Fields.string({
  validate: Validators.required
})
```
#### example:
```ts
@Fields.string({
  validate: task => task.title.length > 3 || "Too Short"
})
```
#### example:
```ts
@Fields.string({
  validate: task => {
    if (task.title.length < 3)
      throw "Too Short";
  }
})
```
#### example:
```ts
@Fields.string({
  validate: (_, fieldValidationEvent) => {
    if (fieldValidationEvent.value.length < 3)
      fieldValidationEvent.error = "Too Short";
  }
})
```
## saving
Will be fired before this field is saved to the server/database
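For example, a sketch in the spirit of the field masking example shown in the access control article: update another field whenever this one changes (field names are illustrative):
```ts
@Fields.string({
  saving: (task, fieldRef) => {
    // Record when the title was last changed
    if (fieldRef.valueChanged()) task.titleChangedAt = new Date()
  },
})
title = ""
```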
## serverExpression
An expression that will determine this field's value on the backend and be provided to the frontend
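For example, a sketch of a value computed on the backend for each row (the field names are illustrative):
```ts
@Fields.boolean({
  serverExpression: (task) => task.title.length > 10,
})
hasLongTitle = false
```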
## dbName
The name of the column in the database that holds the data for this field. If no name is set, the key will be used instead.
#### example:
```ts
@Fields.string({ dbName: 'userName'})
userName=''
```
## sqlExpression
Used for fields that are based on an SQL expression instead of a physical table column
#### example:
```ts
@Fields.integer({
  sqlExpression: e => 'length(title)'
})
titleLength = 0;

@Fields.string()
title = '';
```
## dbReadOnly
For fields that shouldn't be part of an update or insert statement
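For example, a sketch of a column that is populated by the database itself, so Remult never includes it in insert or update statements (the field name is illustrative):
```ts
@Fields.date({ dbReadOnly: true })
dbGeneratedTimestamp?: Date
```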
## valueConverter
The value converter to be used when loading and saving this field
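For example, a sketch of a field-level converter that stores an array as a JSON string in the database (assumes a text column; the field name is illustrative):
```ts
@Fields.object({
  valueConverter: {
    toDb: (tags) => JSON.stringify(tags ?? []),
    fromDb: (dbValue) => JSON.parse(dbValue ?? "[]"),
  },
})
tags: string[] = []
```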
## displayValue
An arrow function that translates the value to a display value
## defaultValue
An arrow function that determines the default value of the field when the entity is created using the `repo.create` method
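For example, a sketch that timestamps new rows created with `repo.create` (the field name is illustrative):
```ts
@Fields.date({
  defaultValue: () => new Date(),
})
createdAt?: Date
```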
## inputType
The html input type for this field
## lazy
* **lazy**
## target
The entity type to which this field belongs
## key
The key to be used for this field
# API Reference - ValueConverter
# ValueConverter
Interface for converting values between different formats, such as in-memory objects, database storage,
JSON data transfer objects (DTOs), and HTML input elements.
## fromJson
Converts a value from a JSON DTO to the valueType. This method is typically used when receiving data
from a REST API call or deserializing a JSON payload.
#### returns:
The converted value.
#### example:
```ts
fromJson: val => new Date(val)
```
Arguments:
* **val** - The value to convert.
## toJson
Converts a value of valueType to a JSON DTO. This method is typically used when sending data
to a REST API or serializing an object to a JSON payload.
#### returns:
The converted value.
#### example:
```ts
toJson: val => val?.toISOString()
```
Arguments:
* **val** - The value to convert.
## fromDb
Converts a value from the database format to the valueType.
#### returns:
The converted value.
#### example:
```ts
fromDb: val => new Date(val)
```
Arguments:
* **val** - The value to convert.
## toDb
Converts a value of valueType to the database format.
#### returns:
The converted value.
#### example:
```ts
toDb: val => val?.toISOString()
```
Arguments:
* **val** - The value to convert.
## toInput
Converts a value of valueType to a string suitable for an HTML input element.
#### returns:
The converted value as a string.
#### example:
```ts
toInput: (val, inputType) => val?.toISOString().substring(0, 10)
```
Arguments:
* **val** - The value to convert.
* **inputType** - The type of the input element (optional).
## fromInput
Converts a string from an HTML input element to the valueType.
#### returns:
The converted value.
#### example:
```ts
fromInput: (val, inputType) => new Date(val)
```
Arguments:
* **val** - The value to convert.
* **inputType** - The type of the input element (optional).
## displayValue
Returns a displayable string representation of a value of valueType.
#### returns:
The displayable string.
#### example:
```ts
displayValue: val => val?.toLocaleDateString()
```
Arguments:
* **val** - The value to convert.
## fieldTypeInDb
Specifies the storage type used in the database for this field. This can be used to explicitly define the data type and precision of the field in the database.
#### example:
```ts
// Define a field with a specific decimal precision in the database
@Fields.number({
valueConverter: {
fieldTypeInDb: 'decimal(18,8)'
}
})
price=0;
```
## inputType
Specifies the type of HTML input element suitable for values of valueType.
#### example:
```ts
inputType = 'date';
```
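Putting these pieces together, a complete converter for a date value stored as an ISO string might look like the following sketch, assembled from the individual snippets above:
```ts
import { ValueConverter } from "remult"

const dateConverter: ValueConverter<Date> = {
  fromJson: (val) => new Date(val),
  toJson: (val) => val?.toISOString(),
  fromDb: (val) => new Date(val),
  toDb: (val) => val?.toISOString(),
  toInput: (val, inputType) => val?.toISOString().substring(0, 10) ?? "",
  fromInput: (val, inputType) => new Date(val),
  displayValue: (val) => val?.toLocaleDateString() ?? "",
  fieldTypeInDb: "date",
  inputType: "date",
}
```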
# API Reference - Validation
# Validation
Validation is a key part of any application, and you will see that it's built into Remult! Let's dive into it...
First of all, some props bring automatic validation, for example `required` and `minLength` for strings:
```ts
@Fields.string({ minLength: 5 })
title = ''
```
You can establish your own validation rules by using the `validate` prop and run any custom code you want:
```ts
@Fields.string({
validate: (task)=> task.title.length > 5 || "too short"
})
title = ''
```
Want to focus only on the value? Use `valueValidator`:
```ts
import { valueValidator } from 'remult'
@Fields.string({
validate: valueValidator(value => value.length > 5)
})
title = ''
```
The `validate` prop can also use built-in validators like this:
```ts
import { Validators } from 'remult'
@Fields.string({
validate: Validators.minLength(5)
})
title = ''
```
It supports an array of validators as well:
```ts
import { Validators } from 'remult'
@Fields.string({
validate: [
Validators.minLength(5),
Validators.maxLength(10),
(task)=> task.title.startsWith('No') || "Need to start with No"
]
})
title = ''
```
Some validators, like `unique`, run on the backend; nothing changes in your code, you just have to use them:
```ts
import { Validators } from 'remult'
@Fields.string({
validate: [
Validators.minLength(5),
Validators.unique()
]
})
title = ''
```
In a custom validator, you can also check whether you are running on the backend:
```ts
import { Validators, isBackend } from 'remult'
@Fields.string({
validate: [
Validators.unique(),
(task) => {
if (isBackend()) {
// check something else...
// throw "a custom message"
}
}
]
})
title = ''
```
If you want to customize the error message, you can do it globally:
```ts
Validators.unique.defaultMessage = 'Existe déjà!'
```
# API Reference - Validators
# Validators
Class containing various field validators.
## constructor
* **new Validators**
## defaultMessage
* **defaultMessage**
## email
Validator to check if a value is a valid email address.
## enum
Validator to check if a value exists in a given enum.
## in
Validator to check if a value is one of the specified values.
## max
Validator to check if a value is less than or equal to a maximum value.
## maxLength
Validator to check if a string's length is less than or equal to a maximum length.
## min
Validator to check if a value is greater than or equal to a minimum value.
## minLength
Validator to check if a string's length is greater than or equal to a minimum length.
## notNull
Validator to check if a value is not null.
## range
Validator to check if a value is within a specified range.
## regex
Validator to check if a value matches a given regular expression.
## relationExists
Validator to check if a related value exists in the database.
## required
Validator to check if a value is required (not null or empty).
## unique
Validator to ensure a value is unique in the database.
## uniqueOnBackend
* **uniqueOnBackend**
## url
Validator to check if a value is a valid URL.
# API Reference - Relations
# Relations
* **Relations**
## constructor
* **new Relations**
## toMany
Define a toMany relation between entities, indicating a one-to-many relationship.
This method allows you to establish a relationship where one entity can have multiple related entities.
#### returns:
A decorator function to apply the toMany relation to an entity field.
Example usage:
```
@Relations.toMany(() => Order)
orders?: Order[];
// or with a custom field name:
@Relations.toMany(() => Order, "customerId")
orders?: Order[];
```
Arguments:
* **toEntityType**
* **fieldInToEntity** - (Optional) The field in the target entity that represents the relation.
Use this if you want to specify a custom field name for the relation.
## toOne
Define a to-one relation between entities, indicating a one-to-one relationship.
If no field or fields are provided, it will automatically create a field in the database
to represent the relation.
#### returns:
A decorator function to apply the to-one relation to an entity field.
Example usage:
```
@Relations.toOne(() => Customer)
customer?: Customer;
```
```
Fields.string()
customerId?: string;
@Relations.toOne(() => Customer, "customerId")
customer?: Customer;
```
```
Fields.string()
customerId?: string;
@Relations.toOne(() => Customer, {
field: "customerId",
defaultIncluded: true
})
customer?: Customer;
```
```
Fields.string()
customerId?: string;
@Relations.toOne(() => Customer, {
fields: {
customerId: "id",
},
})
customer?: Customer;
```
Arguments:
* **toEntityType**
* **options** - (Optional): An object containing options for configuring the to-one relation.
* **caption** - A human readable name for the field. Can be used to achieve a consistent caption for a field throughout the app
* **fields** - An object specifying custom field names for the relation.
Each key represents a field in the related entity, and its value is the corresponding field in the source entity.
For example, `{ customerId: 'id' }` maps the 'customerId' field in the related entity to the 'id' field in the source entity.
This is useful when you want to define custom field mappings for the relation.
* **field** - The name of the field for this relation.
* **findOptions** - Find options to apply to the relation when fetching related entities.
You can specify a predefined set of find options or provide a function that takes the source entity
and returns find options dynamically.
These options allow you to customize how related entities are retrieved.
* **defaultIncluded** - Determines whether the relation should be included by default when querying the source entity.
When set to true, related entities will be automatically included when querying the source entity.
If false or not specified, related entities will need to be explicitly included using the `include` option.
# API Reference - RelationOptions
# RelationOptions
Options for configuring a relation between entities.
## caption
A human readable name for the field. Can be used to achieve a consistent caption for a field throughout the app
## fields
An object specifying custom field names for the relation.
Each key represents a field in the related entity, and its value is the corresponding field in the source entity.
For example, `{ customerId: 'id' }` maps the 'customerId' field in the related entity to the 'id' field in the source entity.
This is useful when you want to define custom field mappings for the relation.
## field
The name of the field for this relation.
## findOptions
Find options to apply to the relation when fetching related entities.
You can specify a predefined set of find options or provide a function that takes the source entity
and returns find options dynamically.
These options allow you to customize how related entities are retrieved.
## defaultIncluded
Determines whether the relation should be included by default when querying the source entity.
When set to true, related entities will be automatically included when querying the source entity.
If false or not specified, related entities will need to be explicitly included using the `include` option.
# API Reference - Remult
# Remult
* **Remult**
## repo
Returns a `Repository` for the specific entity type
#### example:
```ts
const taskRepo = remult.repo(Task);
```
#### see:
[Repository](https://remult.dev/docs/ref_repository.html)
Arguments:
* **entity** - the entity to use
* **dataProvider** - an optional alternative data provider to use. Useful for writing to offline storage or an alternative data provider
## user
Returns the current user's info
## initUser
Fetches user information from the backend and updates the `remult.user` object.
Typically used during application initialization and user authentication.
#### returns:
A promise that resolves to the user's information or `undefined` if unavailable.
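#### example:
A typical usage during application startup (a sketch):
```ts
import { remult } from 'remult'

async function initApp() {
  await remult.initUser()
  if (remult.authenticated()) {
    console.log('Signed in as', remult.user?.name)
  }
}
```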
## authenticated
Checks if a user was authenticated
## isAllowed
checks if the user has any of the roles specified in the parameters
#### example:
```ts
remult.isAllowed("admin")
```
#### see:
[Allowed](https://remult.dev/docs/allowed.html)
Arguments:
* **roles**
## isAllowedForInstance
checks if the user matches the allowedForInstance callback
#### see:
[Allowed](https://remult.dev/docs/allowed.html)
Arguments:
* **instance**
* **allowed**
## useFetch
* **useFetch**
Arguments:
* **fetch**
## dataProvider
The current data provider
## constructor
Creates a new instance of the `remult` object.
Can receive either an HttpProvider or a DataProvider as a parameter - which will be used to fetch data from.
If no provider is specified, `fetch` will be used as an http provider
Arguments:
* **http**
## call
Used to call a `backendMethod` using a specific `remult` object
#### example:
```ts
await remult.call(TasksController.setAll, undefined, true);
```
Arguments:
* **backendMethod** - the backend method to call
* **classInstance** - the class instance of the backend method, for static backend methods use undefined
* **args** - the arguments to send to the backend method
## onFind
A helper callback that can be used to debug and trace all find operations. Useful in debugging scenarios
Arguments:
* **metadata**
* **options**
* **limit** - Determines the number of rows returned by the request, on the browser the default is 100 rows
#### example:
```ts
await this.remult.repo(Products).find({
limit:10,
page:2
})
```
* **page** - Determines the page number that will be used to extract the data
#### example:
```ts
await this.remult.repo(Products).find({
limit:10,
page:2
})
```
* **load**
* **include** - An option used in the `find` and `findFirst` methods to specify which related entities should be included
when querying the source entity. It allows you to eagerly load related data to avoid N+1 query problems.
#### param:
An object specifying the related entities to include, their options, and filtering criteria.
Example usage:
```
const orders = await customerRepo.find({
include: {
// Include the 'tags' relation for each customer.
tags: true,
},
});
```
In this example, the `tags` relation for each customer will be loaded and included in the query result.
#### see:
- Relations.toMany
- Relations.toOne
- RelationOptions
* **where** - filters the data
#### example:
```ts
await taskRepo.find({where: { completed:false }})
```
#### see:
For more usage examples see [EntityFilter](https://remult.dev/docs/entityFilter.html)
* **orderBy** - Determines the order of items returned.
#### example:
```ts
await this.remult.repo(Products).find({ orderBy: { name: "asc" }})
```
#### example:
```ts
await this.remult.repo(Products).find({ orderBy: { price: "desc", name: "asc" }})
```
## clearAllCache
* **clearAllCache**
## entityRefInit
A helper callback that is called whenever an entity is created.
## context
context information that can be used to store custom information that will be disposed as part of the `remult` object
## apiClient
The api client that will be used by `remult` to perform calls to the `api`
## liveQueryStorage
* **liveQueryStorage**
## subscriptionServer
* **subscriptionServer**
## liveQueryPublisher
* **liveQueryPublisher**
## liveQuerySubscriber
* **liveQuerySubscriber**
# API Reference - ApiClient
# ApiClient
Interface for configuring the API client used by Remult to perform HTTP calls to the backend.
## httpClient
The HTTP client to use when making API calls. It can be set to a function with the `fetch` signature
or an object that has `post`, `put`, `delete`, and `get` methods. This can also be used to inject
logic before each HTTP call, such as adding authorization headers.
#### example:
```ts
// Using Axios
remult.apiClient.httpClient = axios;
```
#### example:
```ts
// Using Angular HttpClient
remult.apiClient.httpClient = httpClient;
```
#### see:
If you want to add headers using angular httpClient, see: https://medium.com/angular-shots/shot-3-how-to-add-http-headers-to-every-request-in-angular-fab3d10edc26
#### example:
```ts
// Using fetch (default)
remult.apiClient.httpClient = fetch;
```
#### example:
```ts
// Adding bearer token authorization
remult.apiClient.httpClient = (
input: RequestInfo | URL,
init?: RequestInit
) => {
return fetch(input, {
...init,
headers: authToken
? {
...init?.headers,
authorization: 'Bearer ' + authToken,
}
: init?.headers,
cache: 'no-store',
})
}
```
## url
The base URL for making API calls. By default, it is set to '/api'. It can be modified to be relative
or to use a different domain for the server.
#### example:
```ts
// Relative URL
remult.apiClient.url = './api';
```
#### example:
```ts
// Different domain
remult.apiClient.url = 'https://example.com/api';
```
## subscriptionClient
The subscription client used for real-time data updates. By default, it is set to use Server-Sent Events (SSE).
It can be set to any subscription provider as illustrated in the Remult tutorial for deploying to a serverless environment.
#### see:
https://remult.dev/tutorials/react-next/deployment.html#deploying-to-a-serverless-environment
## wrapMessageHandling
A function that wraps message handling for subscriptions. This is useful for executing some code before
or after any message arrives from the subscription.
For example, in Angular, to refresh a specific part of the UI,
you can call the `NgZone` run method at this time.
#### example:
```ts
// Angular example
import { Component, NgZone } from '@angular/core';
import { remult } from "remult";
export class AppComponent {
constructor(zone: NgZone) {
remult.apiClient.wrapMessageHandling = handler => zone.run(() => handler());
}
}
```
# API Reference - Repository
# Repository
used to perform CRUD operations on an `entityType`
## find
returns a result array based on the provided options
Arguments:
* **options**
* **limit** - Determines the number of rows returned by the request, on the browser the default is 100 rows
#### example:
```ts
await this.remult.repo(Products).find({
limit:10,
page:2
})
```
* **page** - Determines the page number that will be used to extract the data
#### example:
```ts
await this.remult.repo(Products).find({
limit:10,
page:2
})
```
* **load**
* **include** - An option used in the `find` and `findFirst` methods to specify which related entities should be included
when querying the source entity. It allows you to eagerly load related data to avoid N+1 query problems.
#### param:
An object specifying the related entities to include, their options, and filtering criteria.
Example usage:
```
const orders = await customerRepo.find({
include: {
// Include the 'tags' relation for each customer.
tags: true,
},
});
```
In this example, the `tags` relation for each customer will be loaded and included in the query result.
#### see:
- Relations.toMany
- Relations.toOne
- RelationOptions
* **where** - filters the data
#### example:
```ts
await taskRepo.find({where: { completed:false }})
```
#### see:
For more usage examples see [EntityFilter](https://remult.dev/docs/entityFilter.html)
* **orderBy** - Determines the order of items returned.
#### example:
```ts
await this.remult.repo(Products).find({ orderBy: { name: "asc" }})
```
#### example:
```ts
await this.remult.repo(Products).find({ orderBy: { price: "desc", name: "asc" }})
```
## liveQuery
Returns a live query based on the provided options. Subscribe to the returned `LiveQuery` to receive the result set and subsequent real-time updates.
Arguments:
* **options**
* **limit** - Determines the number of rows returned by the request, on the browser the default is 100 rows
#### example:
```ts
await this.remult.repo(Products).find({
limit:10,
page:2
})
```
* **page** - Determines the page number that will be used to extract the data
#### example:
```ts
await this.remult.repo(Products).find({
limit:10,
page:2
})
```
* **load**
* **include** - An option used in the `find` and `findFirst` methods to specify which related entities should be included
when querying the source entity. It allows you to eagerly load related data to avoid N+1 query problems.
#### param:
An object specifying the related entities to include, their options, and filtering criteria.
Example usage:
```
const orders = await customerRepo.find({
include: {
// Include the 'tags' relation for each customer.
tags: true,
},
});
```
In this example, the `tags` relation for each customer will be loaded and included in the query result.
#### see:
- Relations.toMany
- Relations.toOne
- RelationOptions
* **where** - filters the data
#### example:
```ts
await taskRepo.find({where: { completed:false }})
```
#### see:
For more usage examples see [EntityFilter](https://remult.dev/docs/entityFilter.html)
* **orderBy** - Determines the order of items returned.
#### example:
```ts
await this.remult.repo(Products).find({ orderBy: { name: "asc" }})
```
#### example:
```ts
await this.remult.repo(Products).find({ orderBy: { price: "desc", name: "asc" }})
```
## findFirst
Returns the first item that matches the `where` condition
#### example:
```ts
await taskRepo.findFirst({ completed:false })
```
#### example:
```ts
await taskRepo.findFirst({ completed:false },{ createIfNotFound: true })
```
Arguments:
* **where** - filters the data
#### see:
[EntityFilter](http://remult.dev/docs/entityFilter.html)
* **options**
* **load**
* **include** - An option used in the `find` and `findFirst` methods to specify which related entities should be included
when querying the source entity. It allows you to eagerly load related data to avoid N+1 query problems.
#### param:
An object specifying the related entities to include, their options, and filtering criteria.
Example usage:
```
const orders = await customerRepo.find({
include: {
// Include the 'tags' relation for each customer.
tags: true,
},
});
```
In this example, the `tags` relation for each customer will be loaded and included in the query result.
#### see:
- Relations.toMany
- Relations.toOne
- RelationOptions
* **where** - filters the data
#### example:
```ts
await taskRepo.find({where: { completed:false }})
```
#### see:
For more usage examples see [EntityFilter](https://remult.dev/docs/entityFilter.html)
* **orderBy** - Determines the order of items returned.
#### example:
```ts
await this.remult.repo(Products).find({ orderBy: { name: "asc" }})
```
#### example:
```ts
await this.remult.repo(Products).find({ orderBy: { price: "desc", name: "asc" }})
```
* **useCache** - Determines whether to cache the result and return the results from the cache.
* **createIfNotFound** - If set to true and an item is not found, it's created and returned
## findOne
Returns the first item that matches the `where` condition
#### example:
```ts
await taskRepo.findOne({ where:{ completed:false }})
```
#### example:
```ts
await taskRepo.findOne({ where:{ completed:false }, createIfNotFound: true })
```
Arguments:
* **options**
* **load**
* **include** - An option used in the `find` and `findFirst` methods to specify which related entities should be included
when querying the source entity. It allows you to eagerly load related data to avoid N+1 query problems.
#### param:
An object specifying the related entities to include, their options, and filtering criteria.
Example usage:
```
const orders = await customerRepo.find({
include: {
// Include the 'tags' relation for each customer.
tags: true,
},
});
```
In this example, the `tags` relation for each customer will be loaded and included in the query result.
#### see:
- Relations.toMany
- Relations.toOne
- RelationOptions
* **where** - filters the data
#### example:
```ts
await taskRepo.find({where: { completed:false }})
```
#### see:
For more usage examples see [EntityFilter](https://remult.dev/docs/entityFilter.html)
* **orderBy** - Determines the order of items returned.
#### example:
```ts
await this.remult.repo(Products).find({ orderBy: { name: "asc" }})
```
#### example:
```ts
await this.remult.repo(Products).find({ orderBy: { price: "desc", name: "asc" }})
```
* **useCache** - Determines whether to cache the result and return the results from the cache.
* **createIfNotFound** - If set to true and an item is not found, it's created and returned
## findId
Returns the item that matches the id. If the id is `undefined` or `null`, returns `null`.
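#### example:
A minimal sketch (the id value is illustrative):
```ts
const task = await taskRepo.findId('j4h5j4')
```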
Arguments:
* **id**
* **options**
* **load**
* **include** - An option used in the `find` and `findFirst` methods to specify which related entities should be included
when querying the source entity. It allows you to eagerly load related data to avoid N+1 query problems.
#### param:
An object specifying the related entities to include, their options, and filtering criteria.
Example usage:
```
const orders = await customerRepo.find({
include: {
// Include the 'tags' relation for each customer.
tags: true,
},
});
```
In this example, the `tags` relation for each customer will be loaded and included in the query result.
#### see:
- Relations.toMany
- Relations.toOne
- RelationOptions
* **useCache** - Determines whether to cache the result and return the results from the cache.
* **createIfNotFound** - If set to true and an item is not found, it's created and returned
## groupBy
Performs an aggregation on the repository's entity type based on the specified options.
#### returns:
The result of the aggregation.
#### example:
```ts
// Grouping by country and city, summing the salary field, and ordering by country and sum of salary:
const results = await repo.groupBy({
group: ['country', 'city'],
sum: ['salary'],
where: {
salary: { $ne: 1000 },
},
orderBy: {
country: 'asc',
salary: {
sum: 'desc',
},
},
});
// Accessing the results:
console.log(results[0].country); // 'uk'
console.log(results[0].city); // 'London'
console.log(results[0].$count); // count for London, UK
console.log(results[0].salary.sum); // Sum of salaries for London, UK
```
Arguments:
* **options** - The options for the aggregation.
* **group** - Fields to group by. The result will include one entry per unique combination of these fields.
* **sum** - Fields to sum. The result will include the sum of these fields for each group.
* **avg** - Fields to average. The result will include the average of these fields for each group.
* **min** - Fields to find the minimum value. The result will include the minimum value of these fields for each group.
* **max** - Fields to find the maximum value. The result will include the maximum value of these fields for each group.
* **distinctCount** - Fields to count distinct values. The result will include the distinct count of these fields for each group.
* **where** - Filters to apply to the query before aggregation.
#### see:
EntityFilter
* **orderBy** - Fields and aggregates to order the results by.
The result can be ordered by groupBy fields, sum fields, average fields, min fields, max fields, and distinctCount fields.
## aggregate
Performs an aggregation on the repository's entity type based on the specified options.
#### returns:
The result of the aggregation.
#### example:
```ts
// Aggregating (summing the salary field across all items):
const totalSalary = await repo.aggregate({
sum: ['salary'],
});
console.log(totalSalary.salary.sum); // Outputs the total sum of salaries
```
Arguments:
* **options** - The options for the aggregation.
## query
Fetches data from the repository in a way that is optimized for handling large sets of entity objects.
Unlike the `find` method, which returns an array, the `query` method returns an iterable `QueryResult` object.
This allows for more efficient data handling, particularly in scenarios that involve paging through large amounts of data.
The method supports pagination and aggregation in a single request. When aggregation options are provided,
the result will include both the items from the current page and the results of the requested aggregation.
The `query` method is designed for asynchronous iteration using the `for await` statement.
#### example:
```ts
// Basic usage with asynchronous iteration:
for await (const task of taskRepo.query()) {
// Perform some operation on each task
}
```
#### example:
```ts
// Querying with pagination:
const query = taskRepo.query({
where: { completed: false },
pageSize: 100,
});
let paginator = await query.paginator();
console.log('Number of items on the current page:', paginator.items.length);
console.log('Total pages:', Math.ceil(paginator.aggregate.$count / 100));
if (paginator.hasNextPage) {
paginator = await paginator.nextPage();
console.log('Items on the next page:', paginator.items.length);
}
```
#### example:
```ts
// Querying with aggregation:
const query = await repo.query({
where: { completed: false },
pageSize: 50,
aggregates: {
sum: ['salary'],
average: ['age'],
}
});
let paginator = await query.paginator();
// Accessing paginated items
console.table(paginator.items);
// Accessing aggregation results
console.log('Total salary:', paginator.aggregates.salary.sum); // Sum of all salaries
console.log('Average age:', paginator.aggregates.age.average); // Average age
```
Arguments:
* **options**
## count
Returns a count of the items matching the criteria.
#### see:
[EntityFilter](http://remult.dev/docs/entityFilter.html)
#### example:
```ts
await taskRepo.count({ completed:false })
```
Arguments:
* **where** - filters the data
#### see:
[EntityFilter](http://remult.dev/docs/entityFilter.html)
## validate
Validates an item
#### example:
```ts
const error = repo.validate(task);
if (error){
alert(error.message);
alert(error.modelState.title);//shows the specific error for the title field
}
// Can also be used to validate specific fields
const error = repo.validate(task,"title")
```
Arguments:
* **item**
* **fields**
## save
saves an item or item[] to the data source. It assumes that if an `id` value exists, it's an existing row - otherwise it's a new row
#### example:
```ts
await taskRepo.save({...task, completed:true })
```
Arguments:
* **item**
## insert
Inserts an item or item[] into the data source
#### example:
```ts
await taskRepo.insert({title:"task a"})
```
#### example:
```ts
await taskRepo.insert([{title:"task a"}, {title:"task b", completed:true }])
```
Arguments:
* **item**
## update
Updates an item, based on its `id`
#### example:
```ts
taskRepo.update(task.id,{...task,completed:true})
```
Arguments:
* **id**
* **item**
## updateMany
Updates all items that match the `where` condition.
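#### example:
A sketch that marks all incomplete tasks as completed:
```ts
await taskRepo.updateMany({
  where: { completed: false },
  set: { completed: true },
})
```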
Arguments:
* **options**
* **where** - filters the data
#### see:
[EntityFilter](http://remult.dev/docs/entityFilter.html)
* **set**
## upsert
Inserts a new entity or updates an existing entity based on the specified criteria.
If an entity matching the `where` condition is found, it will be updated with the provided `set` values.
If no matching entity is found, a new entity will be created with the given data.
The `upsert` method ensures that a row exists based on the `where` condition: if no entity is found, a new one is created.
It can handle both single and multiple upserts.
#### returns:
A promise that resolves with the inserted or updated entity, or an array of entities if multiple options were provided.
#### example:
```ts
// Upserting a single entity: updates 'task a' if it exists, otherwise creates it.
taskRepo.upsert({ where: { title: 'task a' }, set: { completed: true } });
```
#### example:
```ts
// Upserting a single entity without additional `set` values: ensures that a row with the title 'task a' exists.
taskRepo.upsert({ where: { title: 'task a' } });
```
#### example:
```ts
// Upserting multiple entities: ensures both 'task a' and 'task b' exist, updating their `completed` status if found.
taskRepo.upsert([
{ where: { title: 'task a' }, set: { completed: true } },
{ where: { title: 'task b' }, set: { completed: true } }
]);
```
Arguments:
* **options** - The options that define the `where` condition and the `set` values. Can be a single object or an array of objects.
## delete
Deletes an item
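#### example:
A minimal sketch:
```ts
await taskRepo.delete(task.id)
```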
Arguments:
* **id**
## deleteMany
Deletes all items that match the `where` condition.
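#### example:
A sketch that deletes all completed tasks:
```ts
await taskRepo.deleteMany({ where: { completed: true } })
```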
Arguments:
* **options**
* **where** - filters the data
#### see:
[EntityFilter](http://remult.dev/docs/entityFilter.html)
## create
Creates an instance of an item. It will not be saved to the data source unless `save` or `insert` is called.
It's useful for starting or resetting a form, taking your entity's default values into account.
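#### example:
A sketch that creates an instance and persists it later (the title is illustrative):
```ts
const task = taskRepo.create({ title: 'buy milk' })
// ...let the user edit `task` in a form, then:
await taskRepo.insert(task)
```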
Arguments:
* **item**
## toJson
* **toJson**
Arguments:
* **item**
## fromJson
Translates a json object to an item instance
Arguments:
* **x**
* **isNew**
## getEntityRef
returns an `entityRef` for an item returned by `create`, `find` etc...
Arguments:
* **item**
## fields
Provides information about the fields of the Repository's entity
#### example:
```ts
console.log(repo.fields.title.caption) // displays the caption of a specific field
console.log(repo.fields.title.options)// writes the options that were defined for this field
```
## metadata
The metadata for the `entity`
#### See:
[EntityMetadata](https://remult.dev/docs/ref_entitymetadata.html)
## addEventListener
* **addEventListener**
Arguments:
* **listener**
## relations
* **relations**
Arguments:
* **item**
# API Reference - RemultServerOptions
# RemultServerOptions
* **RemultServerOptions**
## entities
Entities to use for the api
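#### example:
A sketch using the express adapter (the entity import path is illustrative):
```ts
import { remultExpress } from 'remult/remult-express'
import { Task } from '../shared/Task'

export const api = remultExpress({
  entities: [Task],
})
```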
## controllers
Controllers to use for the api
## getUser
Will be called to get the current user based on the current request
## initRequest
Will be called for each request and can be used for configuration
## initApi
Will be called once the server is loaded and the data provider is ready
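#### example:
A sketch that seeds initial data once the data provider is ready (the `Task` entity, its import path, and the seed values are illustrative):
```ts
import { repo } from 'remult'
import { remultExpress } from 'remult/remult-express'
import { Task } from '../shared/Task'

export const api = remultExpress({
  entities: [Task],
  initApi: async () => {
    const taskRepo = repo(Task)
    if ((await taskRepo.count()) === 0) {
      await taskRepo.insert([{ title: 'first task' }, { title: 'second task' }])
    }
  },
})
```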
## dataProvider
Data Provider to use for the api.
#### see:
[Connecting to a Database](https://remult.dev/docs/databases.html).
## ensureSchema
Will create tables and columns in supporting databases. Default: `true`
#### description:
When set to true, it'll create entities that do not exist and add columns that are missing.
## rootPath
The path to use for the api. Default: `/api`
#### description:
If you want to use a different api path, adjust this field.
## defaultGetLimit
The default limit to use for find requests that did not specify a limit
## logApiEndPoints
When set to true (the default), each api endpoint that is created will be logged to the console
## subscriptionServer
A subscription server to use for live query and message channels
## liveQueryStorage
A storage to use to store live queries, relevant mostly for serverless scenarios or larger scales
## contextSerializer
Used to store the context-relevant info for re-running a live query
## admin
When set to true, will display an admin ui in the `/api/admin` url.
Can also be set to an arrow function for fine grained control
#### example:
```ts
admin: true
```
#### example:
```ts
admin: ()=> remult.isAllowed('admin')
```
#### see:
[allowed](http://remult.dev/docs/allowed.html)
## queueStorage
Storage to use for backend methods that use queue
## error
This method is called whenever there is an error in the API lifecycle.
#### example:
```ts
export const api = remultExpress({
error: async (e) => {
if (e.httpStatusCode == 400) {
e.sendError(500, { message: "An error occurred" })
}
}
})
```
# API Reference - EntityFilter
# EntityFilter
Used to filter the desired result set
### Basic example
```ts
where: {
status: 1
}
```
This will include only items where the status is equal to 1.
### In Statement
```ts
where:{ status:[1,3,5] }
//or
where:{ status:{ $in:[1,3,5] } }
```
### Not Equal
```ts
where:{ status:{ "!=":1 }}
//or
where:{ status:{ $ne:1 }}
```
### Not in
```ts
where:{status:{ "!=":[1,2,3] }}
//or
where:{status:{ $ne:[1,2,3] }}
//or
where:{status:{ $nin:[1,2,3] }}
```
### Comparison operators
```ts
where:{ status:{ ">":1 }}
where:{ status:{ ">=":1 }}
where:{ status:{ "<":1 }}
where:{ status:{ "<=":1 }}
//or
where:{ status:{ $gt:1 }}
where:{ status:{ $gte:1 }}
where:{ status:{ $lt:1 }}
where:{ status:{ $lte:1 }}
```
### Contains
```ts
where: {
name: {
$contains: 'joe'
}
}
```
### Not Contains
```ts
where: {
name: {
$notContains: 'joe'
}
}
```
### Starts With
```ts
where: {
name: {
$startsWith: 'joe'
}
}
```
### Ends With
```ts
where: {
name: {
$endsWith: 'joe'
}
}
```
### Id Equal
```ts
where: {
person: {
$id: 123456
}
}
```
### Multiple conditions have an `and` relationship
```ts
where: {
status:1,
archive:false
}
```
### $and
```ts
where: {
$and: [{ status: 1 }, { archive: false }]
}
```
### $or
```ts
where: {
$or: [{ status: 1 }, { archive: false }]
}
```
### $not
```ts
where: {
$not: {
status: 1
}
}
```
# API Reference - EntityMetadata
# EntityMetadata
Metadata for an `Entity`, this metadata can be used in the user interface to provide a richer UI experience
## entityType
The class type of the entity
## key
The Entity's key, also used as its url
## fields
Metadata for the Entity's fields
## caption
A human readable caption for the entity. Can be used to achieve a consistent caption for the entity throughout the app
#### example:
```ts
Create a new item in {taskRepo.metadata.caption}
```
#### see:
EntityOptions.caption
## dbName
The name of the table in the database that holds the data for this entity.
If no name is set in the entity options, the `key` will be used instead.
#### see:
EntityOptions.dbName
## options
The options sent to the `Entity`'s decorator
#### see:
EntityOptions
## apiUpdateAllowed
true if the current user is allowed to update an entity instance
#### see:
EntityOptions.allowApiUpdate
Arguments:
* **item**
## apiReadAllowed
true if the current user is allowed to read from entity
#### see:
EntityOptions.allowApiRead
#### example:
```ts
const taskRepo = remult.repo(Task);
if (taskRepo.metadata.apiReadAllowed){
await taskRepo.find()
}
```
## apiDeleteAllowed
true if the current user is allowed to delete an entity instance
#### see:
EntityOptions.allowApiDelete
#### example:
```ts
const taskRepo = remult.repo(Task);
if (taskRepo.metadata.apiDeleteAllowed(task)){
// display delete button
}
```
Arguments:
* **item**
## apiInsertAllowed
true if the current user is allowed to create an entity instance
#### see:
EntityOptions.allowApiInsert
#### example:
```ts
const taskRepo = remult.repo(Task);
if (taskRepo.metadata.apiInsertAllowed(task)){
// display insert button
}
```
Arguments:
* **item**
## getDbName
* **getDbName**
## idMetadata
Metadata for the Entity's id
#### see:
EntityOptions.id for configuration
# API Reference - FieldMetadata
# FieldMetadata
Metadata for a `Field`, this metadata can be used in the user interface to provide a richer UI experience
## valueType
The field's value type (number, string, etc.)
## key
The field's member name in an object.
#### example:
```ts
const taskRepo = remult.repo(Task);
console.log(taskRepo.metadata.fields.title.key);
// result: title
```
## caption
A human readable caption for the field. Can be used to achieve a consistent caption for a field throughout the app
#### see:
FieldOptions#caption for configuration details
## dbName
The name of the column in the database that holds the data for this field. If no name is set, the key will be used instead.
#### example:
```ts
@Fields.string({ dbName: 'userName'})
userName=''
```
#### see:
FieldOptions#dbName for configuration details
## options
The options sent to this field's decorator
## inputType
The `inputType` relevant for this field, determined by the options sent to its decorator and the valueConverter in these options
## allowNull
if null is allowed for this field
#### see:
FieldOptions#allowNull for configuration details
## target
The class that contains this field
#### example:
```ts
const taskRepo = remult.repo(Task);
Task == taskRepo.metadata.fields.title.target //will return true
```
## getDbName
* **getDbName**
## isServerExpression
Indicates if this field is based on a server expression
## dbReadOnly
Indicates that this field should only be included in select statements and excluded from update or insert statements. Useful for db-generated ids, etc.
#### see:
FieldOptions#dbReadOnly for configuration details
## valueConverter
the Value converter for this field
## displayValue
Get the display value for a specific item
#### see:
FieldOptions#displayValue for configuration details
#### example:
```ts
repo.fields.createDate.displayValue(task) //will display the date as defined in the `displayValue` option defined for it.
```
Arguments:
* **item**
## apiUpdateAllowed
Determines if the current user is allowed to update a specific entity instance.
#### example:
```ts
const taskRepo = remult.repo(Task);
// Check if the current user is allowed to update a specific task
if (taskRepo.metadata.apiUpdateAllowed(task)){
// Allow user to edit the entity
}
```
#### see:
FieldOptions#allowApiUpdate for configuration details
#### returns:
True if the update is allowed.
Arguments:
* **item** - Partial entity instance to check permissions against.
## includedInApi
Determines if a specific entity field should be included in the API based on the current user's permissions.
This method checks visibility permissions for a field within a partial entity instance.
#### example:
```ts
const employeeRepo = remult.repo(Employee);
// Determine if the 'salary' field of an employee should be visible in the API for the current user
if (employeeRepo.fields.salary.includedInApi({ id: 123, name: 'John Doe' })) {
// The salary field is included in the API
}
```
#### see:
FieldOptions#includeInApi for configuration details
#### returns:
True if the field is included in the API.
Arguments:
* **item** - The partial entity instance used to evaluate field visibility.
## toInput
Adapts the value for usage with html input
#### example:
```ts
@Fields.dateOnly()
birthDate = new Date(1976,5,16)
//...
input.value = repo.fields.birthDate.toInput(person) // will return '1976-06-16'
```
#### see:
ValueConverter#toInput for configuration details
Arguments:
* **value**
* **inputType**
## fromInput
Adapts a value received from an html input for usage with this field
#### example:
```ts
@Fields.dateOnly()
birthDate = new Date(1976,5,16)
//...
person.birthDate = repo.fields.birthDate.fromInput(personFormState) // will return Date
```
#### see:
ValueConverter#fromInput for configuration details
Arguments:
* **inputValue**
* **inputType**
# API Reference - Allowed
# Allowed
Throughout the api you'll see methods that use the `Allowed` data type, for example `allowApiRead` etc...
The `Allowed` data type can be set to one of the following values:
- true/false
```ts
{
allowApiRead: true
}
```
- a Role - Checks if the current user has this role.
```ts
{
allowApiRead: 'admin'
}
```
or with a constant
```ts
{
allowApiRead: Roles.admin
}
```
- An Array of Roles - checks if the current user has at least one of the roles in the array
```ts
{
allowApiRead: [Roles.admin, Roles.productManager]
}
```
- A function that gets a `remult` object as a parameter and returns true or false
```ts
{ allowApiRead: Allow.authenticated }
```
or:
```ts
{ allowApiRead: () => remult.user?.name === 'superman' }
```
# AllowedForInstance
In some cases, the allowed value can be evaluated with regard to a specific instance; for example, `allowApiUpdate` can consider specific row values.
The Allowed for Instance method accepts two parameters:
1. The relevant `remult` object
2. The relevant entity instance
For Example:
```ts
@Entity("tasks", {
allowApiUpdate: (task) => remult.isAllowed("admin") && !task!.completed
})
```
# API Reference - BackendMethod
# BackendMethod
Decorator indicating that the decorated method runs on the backend.
It allows the method to be invoked from the frontend while ensuring that the execution happens on the server side.
By default, the method runs within a database transaction, meaning it will either complete entirely or fail without making any partial changes.
This behavior can be controlled using the `transactional` option in the `BackendMethodOptions`.
For more details, see: [Backend Methods](https://remult.dev/docs/backendMethods.html).
#### example:
```typescript
@BackendMethod({ allowed: true })
async someBackendMethod() {
// method logic here
}
```
## allowed
Determines when this `BackendMethod` can execute, see: [Allowed](https://remult.dev/docs/allowed.html)
## apiPrefix
Used to determine the route for the BackendMethod.
#### example:
```ts
{allowed:true, apiPrefix:'someFolder/'}
```
## transactional
Controls whether this `BackendMethod` runs within a database transaction. If set to `true`, the method will either complete entirely or fail without making any partial changes. If set to `false`, the method will not be transactional and may result in partial changes if it fails.
#### default:
```ts
true
```
#### example:
```ts
{allowed: true, transactional: false}
```
## queue
EXPERIMENTAL: Determines if this method should be queued for later execution
## blockUser
EXPERIMENTAL: Determines if the user should be blocked while this `BackendMethod` is running
## paramTypes
* **paramTypes**
# API Reference - QueryResult
# QueryResult
The result of a call to the `query` method in the `Repository` object.
## [asyncIterator]
returns an iterator that iterates the rows in the result using a paging mechanism
#### example:
```ts
for await (const task of taskRepo.query()) {
await taskRepo.save({ ...task, completed: true });
}
```
## count
returns the number of rows that match the query criteria
## getPage
gets the items in a specific page
Arguments:
* **pageNumber**
## forEach
Performs an operation on all the items matching the query criteria
Arguments:
* **what**
## paginator
Returns a `Paginator` object that is used for efficient paging
# API Reference - Paginator
# Paginator
An interface used for paginating the results of the `query` method in the `Repository` object
#### example:
```ts
const query = taskRepo.query({
where: { completed: false },
pageSize: 100,
})
const count = await query.count()
console.log('Paged: ' + count / 100)
let paginator = await query.paginator()
console.log(paginator.items.length)
if (paginator.hasNextPage) {
paginator = await paginator.nextPage()
console.log(paginator.items.length)
}
```
## items
the items in the current page
## hasNextPage
True if next page exists
## count
the count of the total items in the `query`'s result
## nextPage
Gets the next page in the `query`'s result set
# API Reference - LiveQuery
# LiveQuery
The `LiveQuery` interface represents a live query that allows subscribing to changes in the query results.
## subscribe
Subscribes to changes in the live query results.
#### returns:
A function that can be used to unsubscribe from the live query.
#### example:
```ts
// Subscribing to changes in a live query
const unsubscribe = taskRepo
.liveQuery({
limit: 20,
orderBy: { createdAt: 'asc' }
//where: { completed: true },
})
.subscribe(info => setTasks(info.applyChanges));
// Later, to unsubscribe
unsubscribe();
```
Arguments:
* **next** - A function that will be called with information about changes in the query results.
# API Reference - LiveQueryChangeInfo
# LiveQueryChangeInfo
The `LiveQueryChangeInfo` interface represents information about changes in the results of a live query.
## items
The updated array of result items.
## changes
The changes received in the specific message. The change types can be "all" (replace all), "add", "replace", or "remove".
## applyChanges
Applies the changes received in the message to an existing array. This method is particularly useful with React
to update the component's state based on the live query changes.
#### returns:
The updated array of result items after applying the changes.
#### example:
```ts
// Using applyChanges in a React component with useEffect hook
useEffect(() => {
return taskRepo
.liveQuery({
limit: 20,
orderBy: { createdAt: 'asc' }
//where: { completed: true },
})
.subscribe(info => setTasks(info.applyChanges));
}, []);
```
Arguments:
* **prevState** - The previous state of the array of result items.
# API Reference - Filter
# Filter
The `Filter` class is a helper class that focuses on filter-related concerns. It provides methods
for creating and applying filters in queries.
## getPreciseValues
Retrieves precise values for each property in a filter for an entity.
#### returns:
A promise that resolves to a FilterPreciseValues object containing the precise values for each property.
#### example:
```ts
const preciseValues = await Filter.getPreciseValues(meta, {
status: { $ne: 'active' },
$or: [
{ customerId: ["1", "2"] },
{ customerId: "3" }
]
});
console.log(preciseValues);
// Output:
// {
// "customerId": ["1", "2", "3"], // Precise values inferred from the filter
// "status": undefined, // Cannot infer precise values for 'status'
// }
```
Arguments:
* **metadata** - The metadata of the entity being filtered.
* **filter** - The filter to analyze.
## getPreciseValues
Retrieves precise values for each property in a filter for an entity.
#### returns:
A promise that resolves to a FilterPreciseValues object containing the precise values for each property.
#### example:
```ts
const preciseValues = await where.getPreciseValues();
console.log(preciseValues);
// Output:
// {
// "customerId": ["1", "2", "3"], // Precise values inferred from the filter
// "status": undefined, // Cannot infer precise values for 'status'
// }
```
## createCustom
Creates a custom filter. Custom filters are evaluated on the backend, ensuring security and efficiency.
When the filter is used in the frontend, only its name is sent to the backend via the API,
where the filter gets translated and applied in a safe manner.
#### returns:
A function that returns an `EntityFilter` of type `entityType`.
#### example:
```ts
class Order {
//...
static activeOrdersFor = Filter.createCustom(
async ({ year }) => {
return {
status: ['created', 'confirmed', 'pending', 'blocked', 'delayed'],
createdAt: {
$gte: new Date(year, 0, 1),
$lt: new Date(year + 1, 0, 1),
},
}
},
)
}
// Usage
await repo(Order).find({
where: Order.activeOrdersFor({ year }),
})
```
#### see:
[Sql filter and Custom filter](/docs/custom-filter.html)
[Filtering and Relations](/docs/filtering-and-relations.html)
Arguments:
* **translator** - A function that returns an `EntityFilter`.
* **key** - An optional unique identifier for the custom filter.
## entityFilterToJson
Translates an `EntityFilter` to a plain JSON object that can be stored or transported.
#### returns:
A plain JSON object representing the `EntityFilter`.
#### example:
```ts
// Assuming `Task` is an entity class
const jsonFilter = Filter.entityFilterToJson(Task, { completed: true });
// `jsonFilter` can now be stored or transported as JSON
```
Arguments:
* **entityDefs** - The metadata of the entity associated with the filter.
* **where** - The `EntityFilter` to be translated.
## entityFilterFromJson
Translates a plain JSON object back into an `EntityFilter`.
#### returns:
The reconstructed `EntityFilter`.
#### example:
```ts
// Assuming `Task` is an entity class and `jsonFilter` is a JSON object representing an EntityFilter
const taskFilter = Filter.entityFilterFromJson(Task, jsonFilter);
// Using the reconstructed `EntityFilter` in a query
const tasks = await remult.repo(Task).find({ where: taskFilter });
for (const task of tasks) {
// Do something for each task based on the filter
}
```
Arguments:
* **entityDefs** - The metadata of the entity associated with the filter.
* **packed** - The plain JSON object representing the `EntityFilter`.
## fromEntityFilter
Converts an `EntityFilter` to a `Filter` that can be used by the `DataProvider`. This method is
mainly used internally.
#### returns:
A `Filter` instance that can be used by the `DataProvider`.
#### example:
```ts
// Assuming `Task` is an entity class and `taskFilter` is an EntityFilter
const filter = Filter.fromEntityFilter(Task, taskFilter);
// `filter` can now be used with the DataProvider
```
Arguments:
* **entity** - The metadata of the entity associated with the filter.
* **whereItem** - The `EntityFilter` to be converted.
## constructor
* **new Filter**
Arguments:
* **apply**
## resolve
Resolves an entity filter.
This method takes a filter which can be either an instance of `EntityFilter`
or a function that returns an instance of `EntityFilter` or a promise that
resolves to an instance of `EntityFilter`. It then resolves the filter if it
is a function and returns the resulting `EntityFilter`.
#### returns:
The resolved entity filter.
Arguments:
* **filter** - The filter to resolve.
## toJson
* **toJson**
# API Reference - Sort
# Sort
The `Sort` class is used to describe sorting criteria for queries. It is mainly used internally,
but it provides a few useful functions for working with sorting.
## toEntityOrderBy
Translates the current `Sort` instance into an `EntityOrderBy` object.
#### returns:
An `EntityOrderBy` object representing the sort criteria.
## constructor
Constructs a `Sort` instance with the provided sort segments.
Arguments:
* **segments** - The sort segments to be included in the sort criteria.
## Segments
The segments of the sort criteria.
## reverse
Reverses the sort order of the current sort criteria.
#### returns:
A new `Sort` instance with the reversed sort order.
## compare
Compares two objects based on the current sort criteria.
#### returns:
A negative value if `a` should come before `b`, a positive value if `a` should come after `b`, or zero if they are equal.
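#### example:
A sketch that sorts an already-loaded array in memory (assumes `tasks` was fetched earlier and `taskRepo` is the repository of the `Task` entity):
```ts
import { Sort } from 'remult'

const sort = Sort.translateOrderByToSort(taskRepo.metadata, { title: 'asc' })
tasks.sort((a, b) => sort.compare(a, b))
```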
Arguments:
* **a** - The first object to compare.
* **b** - The second object to compare.
* **getFieldKey** - An optional function to get the field key for comparison.
## translateOrderByToSort
Translates an `EntityOrderBy` to a `Sort` instance.
#### returns:
A `Sort` instance representing the translated order by.
Arguments:
* **entityDefs** - The metadata of the entity associated with the order by.
* **orderBy** - The `EntityOrderBy` to be translated.
## createUniqueSort
Creates a unique `Sort` instance based on the provided `Sort` and the entity metadata.
This ensures that the sort criteria result in a unique ordering of entities.
#### returns:
A `Sort` instance representing the unique sort criteria.
Arguments:
* **entityMetadata** - The metadata of the entity associated with the sort.
* **orderBy** - The `Sort` instance to be made unique.
## createUniqueEntityOrderBy
Creates a unique `EntityOrderBy` based on the provided `EntityOrderBy` and the entity metadata.
This ensures that the order by criteria result in a unique ordering of entities.
#### returns:
An `EntityOrderBy` representing the unique order by criteria.
Arguments:
* **entityMetadata** - The metadata of the entity associated with the order by.
* **orderBy** - The `EntityOrderBy` to be made unique.
# API Reference - SqlDatabase
# SqlDatabase
A DataProvider for Sql Databases
#### example:
```ts
const db = new SqlDatabase(new PostgresDataProvider(pgPool))
```
#### see:
[Connecting a Database](https://remult.dev/docs/quickstart#connecting-a-database)
## getDb
Gets the SQL database from the data provider.
#### returns:
The SQL database.
#### see:
[Direct Database Access](https://remult.dev/docs/running-sql-on-the-server)
Arguments:
* **dataProvider** - The data provider.
## createCommand
Creates a new SQL command.
#### returns:
The SQL command.
#### see:
[Direct Database Access](https://remult.dev/docs/running-sql-on-the-server)
## execute
Executes a SQL command.
#### returns:
The SQL result.
#### see:
[Direct Database Access](https://remult.dev/docs/running-sql-on-the-server)
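#### example:
A sketch, assuming the connected data provider is a SQL database that has a `tasks` table:
```ts
import { SqlDatabase } from 'remult'

const db = SqlDatabase.getDb()
const result = await db.execute('select count(*) as total from tasks')
console.log(result.rows[0].total)
```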
Arguments:
* **sql** - The SQL command.
## wrapIdentifier
Wraps an identifier with the database's identifier syntax.
## ensureSchema
* **ensureSchema**
Arguments:
* **entities**
## getEntityDataProvider
Gets the entity data provider.
#### returns:
The entity data provider.
Arguments:
* **entity** - The entity metadata.
## transaction
Runs a transaction. Used internally by remult when transactions are required
#### returns:
The promise of the transaction.
Arguments:
* **action** - The action to run in the transaction.
## rawFilter
Creates a raw filter for entity filtering.
#### returns:
The entity filter with a custom SQL filter.
#### example:
```ts
SqlDatabase.rawFilter(({param}) =>
`"customerId" in (select id from customers where city = ${param(customerCity)})`
)
```
#### see:
[Leveraging Database Capabilities with Raw SQL in Custom Filters](https://remult.dev/docs/custom-filter.html#leveraging-database-capabilities-with-raw-sql-in-custom-filters)
Arguments:
* **build** - The custom SQL filter builder function.
## filterToRaw
Converts a filter to a raw SQL string.
#### see:
[Leveraging Database Capabilities with Raw SQL in Custom Filters](https://remult.dev/docs/running-sql-on-the-server#leveraging-entityfilter-for-sql-databases)
Arguments:
* **repo**
* **condition**
* **sqlCommand**
* **dbNames**
* **wrapIdentifier**
## LogToConsole
- `false` _(default)_ - No logging
- `true` - to log all queries to the console
- `oneLiner` - to log all queries to the console as one line
- a `function` - to log all queries to the console as a custom format
#### example:
```ts
SqlDatabase.LogToConsole = (duration, query, args) => { console.log("be crazy ;)") }
```
## durationThreshold
Threshold in milliseconds for logging queries to the console.
## constructor
Creates a new SQL database.
#### example:
```ts
const db = new SqlDatabase(new PostgresDataProvider(pgPool))
```
Arguments:
* **sql** - The SQL implementation.
## end
# API Reference - SubscriptionChannel
# SubscriptionChannel
The `SubscriptionChannel` class is used to send messages from the backend to the frontend,
using the same mechanism used by live queries.
#### example:
```ts
// Defined in code that is shared between the frontend and the backend
const statusChange = new SubscriptionChannel<{ oldStatus: number, newStatus: number }>("statusChange");
// Backend: Publishing a message
statusChange.publish({ oldStatus: 1, newStatus: 2 });
// Frontend: Subscribing to messages
statusChange.subscribe((message) => {
console.log(`Status changed from ${message.oldStatus} to ${message.newStatus}`);
});
// Note: If you want to publish from the frontend, use a BackendMethod for that.
```
## constructor
Constructs a new `SubscriptionChannel` instance.
Arguments:
* **channelKey** - The key that identifies the channel.
## channelKey
The key that identifies the channel.
## publish
Publishes a message to the channel. This method should only be used on the backend.
Arguments:
* **message** - The message to be published.
* **remult** - An optional instance of Remult to use for publishing the message.
## subscribe
Subscribes to messages from the channel. This method should only be used on the frontend.
#### returns:
A promise that resolves to a function that can be used to unsubscribe from the channel.
Arguments:
* **next** - A function that will be called with each message received.
* **remult** - An optional instance of Remult to use for the subscription.
# API Reference - generateMigrations
# generateMigrations
Generates migration scripts based on changes in entities.
#### see:
[Migrations](https://remult.dev/docs/migrations.html)
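#### example:
A sketch of a migration-generation script (the entity import, connection string, and folder are illustrative):
```ts
import { generateMigrations } from 'remult/migrations'
import { createPostgresDataProvider } from 'remult/postgres'
import { Task } from '../shared/Task'

generateMigrations({
  entities: [Task],
  dataProvider: createPostgresDataProvider({
    connectionString: process.env['DATABASE_URL'],
  }),
  migrationsFolder: 'src/migrations',
})
```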
Arguments:
* **options** - Configuration options for generating migrations.
* **entities** - An array of entity classes whose changes will be included in the migration.
* **dataProvider** - The data provider instance or a function returning a promise of the data provider.
* **migrationsFolder** - (Optional) The path to the folder where migration scripts will be stored. Default is 'src/migrations'.
* **snapshotFile** - (Optional) The path to the file where the snapshot of the last known state will be stored. Default is 'migrations-snapshot.json' in the `migrationsFolder`.
* **migrationsTSFile** - (Optional) The path to the TypeScript file where the generated migrations will be written. Default is 'migrations.ts' in the `migrationsFolder`.
* **endConnection** - (Optional) Determines whether to close the database connection after generating migrations. Default is false.
# API Reference - migrate
# migrate
Applies migration scripts to update the database schema.
#### see:
[Migrations](https://remult.dev/docs/migrations.html)
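#### example:
A sketch that applies the generated migrations (the `migrations` import path refers to the generated `migrations.ts` file and is illustrative):
```ts
import { migrate } from 'remult/migrations'
import { createPostgresDataProvider } from 'remult/postgres'
import { migrations } from './migrations/migrations'

migrate({
  migrations,
  dataProvider: createPostgresDataProvider({
    connectionString: process.env['DATABASE_URL'],
  }),
  endConnection: true,
})
```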
Arguments:
* **options** - Configuration options for applying migrations.
* **migrations** - An object containing the migration scripts, each keyed by a unique identifier.
* **dataProvider** - The data provider instance or a function returning a promise of the data provider.
* **migrationsTable** - (Optional) The name of the table that tracks applied migrations. Default is '__remult_migrations_version'.
* **endConnection** - (Optional) Determines whether to close the database connection after applying migrations. Default is false.
* **beforeMigration** - (Optional) A callback function that is called before each migration is applied. Receives an object with the migration index.
* **afterMigration** - (Optional) A callback function that is called after each migration is applied. Receives an object with the migration index and the duration of the migration.
# API Reference - REST API Spec
# Entity Rest Api Breakdown
All entities automatically expose a rest API based on the parameters defined in its decorator.
The API supports the following actions (we'll use the `products` entity as an example, and a specific product with an id=7):
| Http Method | Description | example | requires |
| ----------- | ----------------------------------------------------------------------------------- | --------------- | -------------- |
| GET | returns an array of rows | /api/products | allowApiRead |
| GET | returns a single row based on its id | /api/products/7 | allowApiRead |
| POST | creates a new row based on the object sent in the body, and returns the new row | /api/products | allowApiInsert |
| PUT | updates an existing row based on the object sent in the body and returns the result | /api/products/7 | allowApiUpdate |
| DELETE | deletes an existing row | /api/products/7 | allowApiDelete |
## Sort
Add \_sort and \_order (ascending order by default)
```
https://mySite.com/api/products?_sort=price&_order=desc
```
## Filter
You can filter the rows using different operators
```
https://mySite.com/api/products?price.gte=5&price.lte=10
```
### Filter Operators
| operator | description | example |
| ------------ | --------------------- | -------------------------------------------------- |
| _(none)_ | Equal to | price=10 |
| .ne | Not Equal | price.ne=10 |
| .in | is in json array | price.in=%5B10%2C20%5D _(url encoded - `[10,20]`)_ |
| .contains | Contains a string | name.contains=ee |
| .notContains | Not contains a string | name.notContains=ee |
| .startsWith | Starts with a string | name.startsWith=ee |
| .endsWith | Ends with a string | name.endsWith=ee |
| .gt | Greater than | price.gt=10 |
| .gte | Greater than or equal | price.gte=10 |
| .lt | Less than | price.lt=10 |
| .lte | Less than or equal | price.lte=10 |
| .null | is or is not null | price.null=true |
- You can combine several filter conditions using the `&` separator, as in the example below.
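For example, a filtered, sorted query can be issued with a plain `fetch` call (a hypothetical client-side sketch using the `products` examples above):
```ts
// Products priced between 5 and 10, most expensive first
const response = await fetch(
  '/api/products?price.gte=5&price.lte=10&_sort=price&_order=desc',
)
const products = await response.json()
console.log(products.length)
```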
### Count
```
https://mySite.com/api/products?price.gte=10&__action=count
```
returns:
```JSON
{
"count": 4
}
```
## Paginate
The default page size is 100 rows.
```
https://mySite.com/api/products?_limit=25
```
```
https://mySite.com/api/products?_limit=5&_page=3
```
:::tip
You can use all of these in conjunction:
```
https://mySite.com/api/products?price.gte=5&price.lte=10&_sort=price&_order=desc&_limit=5&_page=3
```
:::
# Active Record & Mutable - EntityBase
# EntityBase
* **EntityBase**
## constructor
* **new EntityBase**
## $
* **$**
## _
* **_**
## assign
* **assign**
Arguments:
* **values**
## delete
* **delete**
## isNew
* **isNew**
## save
* **save**
# Active Record & Mutable - IdEntity
# IdEntity
* **IdEntity**
## constructor
* **new IdEntity**
## id
* **id**
## $
* **$**
## _
* **_**
## assign
* **assign**
Arguments:
* **values**
## delete
* **delete**
## isNew
* **isNew**
## save
* **save**
# Active Record & Mutable - EntityRef
# EntityRef
* **EntityRef**
## hasErrors
* **hasErrors**
## undoChanges
* **undoChanges**
## save
* **save**
## reload
* **reload**
## delete
* **delete**
## isNew
* **isNew**
## wasChanged
* **wasChanged**
## wasDeleted
* **wasDeleted**
## getId
* **getId**
## getOriginalId
* **getOriginalId**
## toApiJson
* **toApiJson**
## validate
* **validate**
## clone
* **clone**
## subscribe
* **subscribe**
Arguments:
* **listener**
## error
* **error**
## repository
* **repository**
## metadata
* **metadata**
## apiUpdateAllowed
* **apiUpdateAllowed**
## apiDeleteAllowed
* **apiDeleteAllowed**
## apiInsertAllowed
* **apiInsertAllowed**
## isLoading
* **isLoading**
## fields
* **fields**
## relations
* **relations**
# Active Record & Mutable - FieldRef
# FieldRef
* **FieldRef**
## subscribe
* **subscribe**
Arguments:
* **listener**
## valueChanged
* **valueChanged**
## load
Loads the related value - returns null if the related value is not found
## valueIsNull
* **valueIsNull**
## originalValueIsNull
* **originalValueIsNull**
## validate
* **validate**
## error
* **error**
## displayValue
* **displayValue**
## value
* **value**
## originalValue
* **originalValue**
## inputValue
* **inputValue**
## entityRef
* **entityRef**
## container
* **container**
## metadata
* **metadata**
# Active Record & Mutable - getEntityRef
# getEntityRef
Retrieves the EntityRef object associated with the specified entity instance.
The EntityRef provides methods for performing operations on the entity instance.
#### returns:
The EntityRef object associated with the specified entity instance.
#### throws:
If throwException is true and the EntityRef object cannot be retrieved.
#### see:
[Active Record & EntityBase](https://remult.dev/docs/active-record)
Arguments:
* **entity** - The entity instance.
* **throwException** - Indicates whether to throw an exception if the EntityRef object cannot be retrieved.
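As a quick illustration, here is a minimal sketch of using `getEntityRef` with a plain (non-`EntityBase`) entity instance; the `Task` entity and its location are assumptions:
```ts
import { getEntityRef, remult } from 'remult'
import { Task } from './shared/Task.js' // assumed entity

const task = remult.repo(Task).create({ title: 'Buy milk' })
const ref = getEntityRef(task)
console.log(ref.isNew()) // true before the first save
await ref.save() // persists the instance and updates it with server-generated values
```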
# Active Record & Mutable - getFields
# getFields
* **getFields**
Arguments:
* **container**
* **remult**
# Remult Blog - Introducing Remult
---
title: "Introducing Remult: The Backend to Frontend Framework You Always Wanted"
sidebar: false
editLink: false
---
> March 14, 2023 | Noam Honig
# Introducing Remult: The Backend to Frontend Framework You Always Wanted
Application developers, as their name implies, like to develop applications––they ultimately care very little about frontend vs. backend, and really just want to deliver value to users. Being an application developer myself, and very much like other application developers, one of the things that constantly drives my decision making when selecting tools and frameworks is the fact that I’m also quite lazy.
My main objective is to be able to ship applications with as little effort as possible, and my pet peeve is silly repetitive, mechanical tasks that make me die a little inside every time I need to perform them. For example, I don’t like to have to remember to align things that don’t automatically align themselves, as one common example of a repetitive task that is completely superfluous. I guess my trademark is that when I encounter challenges, I will always look for ways to automate a solution. (Like the app I once built in the 90s to send a romantic text message to my girlfriend once a day––to overcome my own romantic shortcomings).
I've been building tools and frameworks to increase developer productivity most of my professional career. This all started back in 2006, when I identified a need for modernizing applications created using a low-code generator, to C#.NET. I decided to not only create a fully automated migration solution but more importantly, create a C# class library that would enable an equivalent speed of development that was previously made possible by using the low-code tool.
This was a very high bar, as developers that are used to working in low-code are also often the type that don’t really like the details or the bits and bytes of the internals. A developer who is happy to use low-code tooling is essentially only looking to write code that gets a specific task done. Being able to replicate this in a coding framework needed to provide as seamless and simple of an experience that they had until now––but with code.
Hundreds of organizations have used Firefly’s migration solution, from fortune 500s and governments to small software houses. The C# library provided is still in use today, enabling “low-code level”, highly productive application development, combined with the flexibility of code.
The ability to efficiently ship full applications with a simple library, to me, felt like something that had to be portable and replicated to the modern web. The wheels began to turn.
## What Brought Us Here
Like many other web developers, when Node.js came out, I was excited about the promise of backend to frontend that came with it, that finally we don’t have to learn two languages to build a full stack application. Just the sheer mental switch between languages––for me it was C# on the backend and then Javascript on the frontend, and this always came with its toll of friction and context switch that could even be considered debilitating to my dev workflow.
Even with the possibility Node.js presented of enabling Javascript on both ends, still not many developers chose to do so. All of the best practices and learning resources continued to recommend using two separate approaches for backend and frontend, and a complete separation of tools, libraries and frameworks with very little sharing of code and patterns for frontend and backend.
This modus operandi of having two separate languages each with its own syntax and logic, creates the need for a lot of repetitive boilerplate, such as code to pull the data from the database (for every single application entity), code to expose entity CRUD operations as an API, with four, five or even six different routes per entity, methods for these routes, and these all would again get duplicated and reused hundreds of times, and each time further complicated. You get the idea.
And this is just on the backend.
Now on the frontend you have the reverse code for this: code that takes these JSON responses and rebuilds objects out of them, so that you can use them on the frontend. You're just trying to get data from the database to the users, but in the meantime you need code to read the database, serialize the JSON, send it over a route, only to deserialize and query it on the frontend, just to get it in front of the user.
This is all mechanical code that practically does nothing, and all of it gets repeated endlessly.
Wait, there’s more. Every API route needs to be able to fetch data, provide some sort of server-side paging, sorting and filtering, delete, insert, and update, all of these very generic actions are repeated over and over by all developers building full stack applications all the time in millions of lines of code.
Now let’s talk about concerns that cross over from the frontend to the backend that get duplicated. Up until a couple of years ago the frontend and backend were at best sharing types, but there’s so much more to types than just strings or integers.
Common questions like: how do you serialize these from JSON and back to JSON? How do you validate them? Today, validations on the frontend and backend are completely separate operations. Which begs the question: why? Why should I have to remember (as a lazy developer, mind you) to perform two separate validations on the frontend and the backend?
![Duplicate boilerplate code](./introducing-remult-part-1/boilerplate.png)
There's also a cultural aspect to this dichotomy between frontend and backend code: it requires such impeccable communication and alignment between developers that it's almost an impossible feat. At the end of the day, all of those handoffs are points of friction where things can go wrong, with two completely different developers maintaining the code.
## Enter Remult
Remember when I said that when I encounter a challenge, my first course of action is to try and automate it away? I couldn’t fathom how in 2018 it still is not viable to be able to get the same code to run on the frontend and the backend. I started toying with this idea to see if I could truly make this possible, improve my own productivity (and hopefully for other developers too)––from validations, to typing, through authentication and authorization, all of the typical code that’s constantly being duplicated.
### The Remult Backstory
Remult started as a side project, without a name, and with a completely different (albeit noble) purpose. My wife volunteered at a food bank, and as a result, I too was volunteered to participate in distributing food parcels to the needy. One day, as I was driving to distribute the parcels, holding a paper with a list of addresses I found myself getting lost in places you don’t want to get lost, and I knew I had to build them an app to help volunteers navigate efficiently. I knew I could solve a lot of friction in the process of delivering food parcels to the needy through code––which is what I do best, and I wanted to focus on building the actual application and its capabilities, and not the pieces that hold it together.
So I built an application for inventory and distribution of our local food bank in Even Yehuda, an application they could use to generate distribution lists for volunteer couriers, and for the couriers to be able to navigate to and report back on delivery. I wrote the app, and at the same time the framework as well, the very framework I wanted to have when building web applications. One that would focus on the data flow from the backend database all the way through to the frontend framework (the framework of your choice––whether Angular, React, or Vue––it shouldn’t matter).
![Food Bank App](./introducing-remult-part-1/food-bank-app.jpg)
Instead of having to go through the entire process described above of serializing objects for every HTTP call on the frontend, and then reversing the entire process back into JSON from the backend to the frontend––this framework now made it possible to query on the frontend, using objects, and then automated the entire process from the frontend to the backend and back again. I finally had the framework I dreamed of that eliminates the need to write all of this boilerplate, repetitive, duct tape code over and over again.
As it grew and saw more use, a colleague and I kept working on the framework: we invested in its scalability and stability, improved its API through several iterations, and added many features. The application built upon this framework was quickly adopted by other food banks around Israel that encountered similar challenges with parcel delivery. After its first year, our application had helped distribute 17,000 parcels from food banks around Israel. We were quite proud of this achievement, and we started feeling like our framework could possibly withstand the test of scale, but we had no idea what was to come next.
## What COVID Taught us About Scale and Security
Then COVID hit––and lock downs cut people in need off from the entire world. Suddenly, the need to distribute food to the elderly and disabled skyrocketed. The demand grew from 17,000 parcels annually to 17,000 parcels a day. The app was then contributed for free to municipalities, NGOs and even the IDF’s Home Front to enable better inventory, allocation and distribution of parcels around Israel.
Once the application was adopted by the IDF, it also underwent a battery of security testing––cyber and penetration testing, which leveled up its security significantly. The backend to frontend framework, and the application built upon it that was supposed to be just an experiment, withstood the scale of half a million parcel distributions in 2020 alone, and since then has maintained a similar number and is only growing. During COVID it was adopted by multiple countries around the globe––from Australia, to the EU, USA, and South Africa––to respond to similar needs during the pandemic.
This is the backbone upon which Remult was built and battle tested, with all of this running on a $16-a-month Heroku server.
Once the pandemic was behind us, my co-creator and I realized we had learned a lot. We understood the framework was robust and could scale, was aligned with security best practices, and delivered the promise of democratizing the ability to build full stack applications without all of the known friction.
We wanted to share this with the world.
So we open sourced the framework, to enable other lazy developers like us to be able to invest their energy in writing excellent applications that deliver value to users, and not on repeatable, mechanical code that apparently actually **can be automated** and shared by the backend and frontend.
Check out [Remult](https://remult.dev), and if you like it, give it a star. Let us know what you’d like to see next, and also feel free to contribute to the project.
In our next post, we'll do a roundup of tools in the ecosystem looking to solve similar challenges through different approaches, and unpack where Remult fits in, and what it's optimized for.
# react - Tutorial - Setup
# Build a Full-Stack React Application
### Create a simple todo app with Remult using a React frontend
In this tutorial, we are going to create a simple app to manage a task list. We'll use `React` for the UI, `Node.js` + `Express.js` for the API server, and Remult as our full-stack CRUD framework. For deployment to production, we'll use [railway.app](https://railway.app/) and a `PostgreSQL` database.
By the end of the tutorial, you should have a basic understanding of Remult and how to use it to accelerate and simplify full stack app development.
::: tip Prefer Angular?
Check out the [Angular tutorial](../angular/).
:::
### Prefer an Interactive Online Learning Experience?
If you'd rather follow along with an interactive, online tutorial, [try our interactive tutorial here](https://learn.remult.dev). It provides a hands-on, guided approach to building the same full-stack todo app with React and Remult.
---
### Prerequisites
This tutorial assumes you are familiar with `TypeScript` and `React`.
Before you begin, make sure you have [Node.js](https://nodejs.org) and [git](https://git-scm.com/) installed.
# Setup for the Tutorial
This tutorial requires setting up a React project, an API server project, and a few lines of code to add Remult.
You can either **use a starter project** to speed things up, or go through the **step-by-step setup**.
## Option 1: Clone the Starter Project
1. Clone the _react-vite-express-starter_ repository from GitHub and install its dependencies.
```sh
git clone https://github.com/remult/react-vite-express-starter.git remult-react-todo
cd remult-react-todo
npm install
```
2. Open your IDE.
3. Open a terminal and run the `dev` npm script.
```sh
npm run dev
```
4. Open another terminal and run the `dev-node` npm script
```sh
npm run dev-node
```
The default "Vite + React" app main screen should be available at the default Vite dev server address [http://127.0.0.1:5173](http://127.0.0.1:5173).
At this point, our starter project is up and running. We are now ready to move to the [next step of the tutorial](./entities.md) and start creating the task list app.
## Option 2: Step-by-step Setup
### Create a React project using Vite
Create the new React project.
```sh
npm create -y vite@latest remult-react-todo -- --template react-ts
cd remult-react-todo
```
::: warning Run into issues scaffolding the Vite project?
See [Vite documentation](https://vitejs.dev/guide/#scaffolding-your-first-vite-project) for help.
:::
In this tutorial, we'll be using the root folder created by `Vite` as the root folder for our server project as well.
### Install required packages
We need `Express` to serve our app's API, and, of course, `Remult`. For development, we'll use [tsx](https://www.npmjs.com/package/tsx) to run the API server.
```sh
npm i express remult
npm i --save-dev @types/express tsx
```
### Create the API server project
The starter API server TypeScript project contains a single module that initializes `Express`, and begins listening for API requests.
1. Open your IDE.
2. Create a `server` folder under the `src/` folder created by Vite.
3. Create an `index.ts` file in the `src/server/` folder with the following code:
```ts [index.ts]
// src/server/index.ts
import express from 'express'
const app = express()
app.listen(3002, () => console.log('Server started'))
```
### Bootstrap Remult in the back-end
Remult is loaded in the back-end as an `Express middleware`.
1. Create an `api.ts` file in the `src/server/` folder with the following code:
```ts
// src/server/api.ts
import { remultExpress } from 'remult/remult-express'
export const api = remultExpress()
```
2. Add the highlighted code lines to register the middleware in the main server module `index.ts`.
```ts{4,7}
// src/server/index.ts
import express from "express"
import { api } from "./api.js"
const app = express()
app.use(api)
app.listen(3002, () => console.log("Server started"))
```
::: warning ESM
In this tutorial we will be using ESM for the Node.js server. That means that wherever we import a file, we have to include the `.js` suffix, as we did above in the `import { api } from "./api.js"` statement.
:::
### Final tweaks
Our full stack starter project is almost ready. Let's complete these final configurations.
#### Enable TypeScript decorators in Vite
Add the following entry to the `defineConfig` section of the `vite.config.ts` file to enable the use of decorators in the React app.
```ts{6-12}
// vite.config.ts
// ...
export default defineConfig({
plugins: [react()],
esbuild: {
tsconfigRaw: {
compilerOptions: {
experimentalDecorators: true,
},
},
},
});
```
#### Create the server tsconfig file
In the root folder, create a TypeScript configuration file `tsconfig.server.json` for the server project.
```json
{
"compilerOptions": {
"experimentalDecorators": true,
"skipLibCheck": true,
"esModuleInterop": true,
"outDir": "dist",
"rootDir": "src",
"module": "nodenext"
},
"include": ["src/server/**/*", "src/shared/**/*"]
}
```
#### Proxy API requests from the Vite dev server to the API server
The React app created in this tutorial is intended to be served from the same domain as its API.
However, for development, the API server will be listening on `http://localhost:3002`, while the React app is served from the default `http://localhost:5173`.
We'll use the [proxy](https://vitejs.dev/config/#server-proxy) feature of Vite to divert all calls for `http://localhost:5173/api` to our dev API server.
Configure the proxy by adding the following entry to the `vite.config.ts` file:
```ts{6}
// vite.config.ts
//...
export default defineConfig({
plugins: [react()],
server: { proxy: { "/api": "http://localhost:3002" } },
esbuild: {
tsconfigRaw: {
compilerOptions: {
experimentalDecorators: true,
},
},
},
});
```
### Run the app
1. Open a terminal and start the vite dev server.
```sh
npm run dev
```
2. Add an `npm` script named `dev-node` to start the dev API server in the `package.json`.
```json
// package.json
"dev-node": "tsx watch --tsconfig tsconfig.server.json src/server"
```
3. Open another terminal and start the `node` server
```sh
npm run dev-node
```
The server is now running and listening on port 3002. `tsx` is watching for file changes and will restart the server when code changes are saved.
The default "Vite + React" app main screen should be available at the default Vite dev server address [http://127.0.0.1:5173](http://127.0.0.1:5173).
### Remove React default styles
The React default styles won't fit our todo app. If you'd like a nice-looking app, replace the contents of `src/index.css` with [this CSS file](https://raw.githubusercontent.com/remult/react-vite-express-starter/master/src/index.css). Otherwise, you can simply **delete the contents of `src/index.css`**.
### Setup completed
At this point, our starter project is up and running. We are now ready to move to the [next step of the tutorial](./entities.md) and start creating the task list app.
# react - Tutorial - Entities
# Entities
Let's start coding the app by defining the `Task` entity class.
The `Task` entity class will be used:
- As a model class for client-side code
- As a model class for server-side code
- By `remult` to generate API endpoints, API queries, and database commands
The `Task` entity class we're creating will have an auto-generated `id` field, a `title` field, a `completed` field and an auto-generated `createdAt` field. The entity's API route ("tasks") will include endpoints for all `CRUD` operations.
## Define the Model
1. Create a `shared` folder under the `src` folder. This folder will contain code shared between frontend and backend.
2. Create a file `Task.ts` in the `src/shared/` folder, with the following code:
```ts
// src/shared/Task.ts
import { Entity, Fields } from 'remult'
@Entity('tasks', {
allowApiCrud: true,
})
export class Task {
@Fields.cuid()
id = ''
@Fields.string()
title = ''
@Fields.boolean()
completed = false
@Fields.createdAt()
createdAt?: Date
}
```
3. In the server's `api` module, register the `Task` entity with Remult by adding `entities: [Task]` to an `options` object you pass to the `remultExpress()` middleware:
```ts{4,7}
// src/server/api.ts
import { remultExpress } from "remult/remult-express"
import { Task } from "../shared/Task.js"
export const api = remultExpress({
entities: [Task]
})
```
::: warning ESM
In this tutorial we will be using ESM for the Node.js server. That means that wherever we import a file, we have to include the `.js` suffix, as we did above in the `import { Task } from "../shared/Task.js"` statement.
:::
The [@Entity](../../docs/ref_entity.md) decorator tells Remult this class is an entity class. The decorator accepts a `key` argument (used to name the API route and as a default database collection/table name), and an `options` argument used to define entity-related properties and operations, discussed in the next sections of this tutorial.
To initially allow all CRUD operations for tasks, we set the option [allowApiCrud](../../docs/ref_entity.md#allowapicrud) to `true`.
The [@Fields.cuid](../../docs/field-types.md#fields-cuid) decorator tells Remult to automatically generate a short random id using the [cuid](https://github.com/paralleldrive/cuid) library. This value can't be changed after the entity is created.
The [@Fields.string](../../docs/field-types.md#fields-string) decorator tells Remult the `title` property is an entity data field of type `String`. This decorator is also used to define field-related properties and operations, discussed in the next sections of this tutorial and the same goes for `@Fields.boolean` and the `completed` property.
The [@Fields.createdAt](../../docs/field-types.md#fields-createdat) decorator tells Remult to automatically generate a `createdAt` field with the current date and time.
::: tip
For a complete list of supported field types, see the [Field Types](../../docs/field-types.md) section in the Remult documentation.
:::
## Test the API
Now that the `Task` entity is defined, we can start using the REST API to query and add tasks.
1. Open a browser with the url: [http://localhost:3002/api/tasks](http://localhost:3002/api/tasks), and you'll see that you get an empty array.
2. Use `curl` to `POST` a new task - _Clean car_.
```sh
curl http://localhost:3002/api/tasks -d "{\"title\": \"Clean car\"}" -H "Content-Type: application/json"
```
3. Refresh the browser for the url: [http://localhost:3002/api/tasks](http://localhost:3002/api/tasks) and see that the array now contains one item.
4. Use `curl` to `POST` a few more tasks:
```sh
curl http://localhost:3002/api/tasks -d "[{\"title\": \"Read a book\"},{\"title\": \"Take a nap\", \"completed\":true },{\"title\": \"Pay bills\"},{\"title\": \"Do laundry\"}]" -H "Content-Type: application/json"
```
- Note that the `POST` endpoint can accept a single `Task` or an array of `Task`s.
5. Refresh the browser again, to see that the tasks were stored in the db.
::: warning Wait, where is the backend database?
While remult supports [many relational and non-relational databases](https://remult.dev/docs/databases.html), in this tutorial we start by storing entity data in a backend **JSON file**. Notice that a `db` folder has been created under the root folder, with a `tasks.json` file containing the created tasks.
:::
## Admin UI
### Enabling the Admin UI
Add the Admin UI to your React application by setting the `admin` option to `true` in the `remultExpress()` options:
::: code-group
```ts [src/server/api.ts]
import { remultExpress } from 'remult/remult-express'
import { Task } from '../shared/Task.js'
export const api = remultExpress({
entities: [Task],
admin: true, // Enable the Admin UI
})
```
:::
### Accessing and Using the Admin UI
Navigate to `http://localhost:5173/api/admin` to access the Admin UI. Here, you can perform CRUD operations on your entities, view their relationships via the Diagram entry, and ensure secure management with the same validations and authorizations as your application.
![Remult Admin](/remult-admin.png)
### Features
- **CRUD Operations**: Directly create, update, and delete tasks through the Admin UI.
- **Entity Diagram**: Visualize relationships between entities for better data structure understanding.
- **Security**: Operations are secure, adhering to application-defined rules.
## Display the Task List
Let's start developing the web app by displaying the list of existing tasks in a React component.
Replace the contents of `src/App.tsx` with the following code:
```tsx
// src/App.tsx
import { useEffect, useState } from 'react'
import { remult } from 'remult'
import { Task } from './shared/Task'

const taskRepo = remult.repo(Task)

export default function App() {
  const [tasks, setTasks] = useState<Task[]>([])

  useEffect(() => {
    taskRepo.find().then(setTasks)
  }, [])

  return (
    <div>
      <h1>Todos</h1>
      <main>
        {tasks.map((task) => {
          return <div key={task.id}>{task.title}</div>
        })}
      </main>
    </div>
  )
}
```
Here's a quick overview of the different parts of the code snippet:
- `taskRepo` is a Remult [Repository](../../docs/ref_repository.md) object used to fetch and create Task entity objects.
- `tasks` is a Task array React state to hold the list of tasks.
- React's useEffect hook is used to call the Remult [repository](../../docs/ref_repository.md)'s [find](../../docs/ref_repository.md#find) method to fetch tasks from the server, once when the React component is loaded.
After the browser refreshes, the list of tasks appears.
# react - Tutorial - Paging, Sorting and Filtering
# Paging, Sorting and Filtering
The RESTful API created by Remult supports **server-side paging, sorting, and filtering**. Let's use that to limit, sort and filter the list of tasks.
## Limit Number of Fetched Tasks
Since our database may eventually contain a lot of tasks, it makes sense to use a **paging strategy** to limit the number of tasks retrieved in a single fetch from the back-end database.
Let's limit the number of fetched tasks to `20`.
In the `useEffect` hook, pass an `options` argument to the `find` method call and set its `limit` property to 20.
```ts{6}
// src/App.tsx
useEffect(() => {
taskRepo
.find({
limit: 20
})
.then(setTasks)
}, [])
```
There aren't enough tasks in the database for this change to have an immediate effect, but it will have one later on, when we add more tasks.
::: tip
To query subsequent pages, use the [Repository.find()](../../docs/ref_repository.md#find) method's `page` option.
:::
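For instance, a minimal sketch of fetching the second page of 20 tasks with the same `taskRepo`:
```ts
// A sketch: page 2 of the task list, 20 tasks per page
const secondPage = await taskRepo.find({
  limit: 20,
  page: 2,
  orderBy: { createdAt: 'asc' },
})
```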
## Sorting By Creation Date
We would like old tasks to appear first in the list, and new tasks to appear last. Let's sort the tasks by their `createdAt` field.
In the `useEffect` hook, set the `orderBy` property of the `find` method call's `options` argument to an object that contains the fields you want to sort by.
Use "asc" and "desc" to determine the sort order.
```ts{7}
// src/App.tsx
useEffect(() => {
taskRepo
.find({
limit: 20,
orderBy: { createdAt: "asc" }
})
.then(setTasks)
}, [])
```
## Server side Filtering
Remult supports sending filter rules to the server to query only the tasks that we need.
Adjust the `useEffect` hook to fetch only `completed` tasks.
```ts{8}
// src/App.tsx
useEffect(() => {
taskRepo
.find({
limit: 20,
orderBy: { createdAt: "asc" },
where: { completed: true }
})
.then(setTasks)
}, [])
```
::: warning Note
Because the `completed` field is of type `boolean`, the argument is **compile-time checked to be of the `boolean` type**. Setting the `completed` filter to `undefined` causes it to be ignored by Remult.
:::
Play with different filtering values, and eventually comment the filter out, since we need all the tasks for the next steps.
```ts{8}
// src/App.tsx
useEffect(() => {
taskRepo
.find({
limit: 20,
orderBy: { createdAt: "asc" }
//where: { completed: true },
})
.then(setTasks)
}, [])
```
::: tip Learn more
Explore the reference for a [comprehensive list of filtering options](../../docs/entityFilter.md).
:::
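As a taste of what the `EntityFilter` syntax supports beyond simple equality, here's a hedged sketch combining comparison and text operators on our `Task` fields (check the reference above for the full operator list):
```ts
// A sketch of richer where clauses on the Task entity
const recentCarTasks = await taskRepo.find({
  where: {
    completed: false,
    createdAt: { $gte: new Date('2024-01-01') }, // comparison operator
    title: { $contains: 'car' }, // text operator
  },
})
```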
# react - Tutorial - CRUD Operations
# CRUD Operations
## Adding new tasks
Now that we can see the list of tasks, it's time to add a few more.
Add the highlighted `newTaskTitle` state and `addTask` function to the `App` component.
```ts{5-16}
// src/App.tsx
export default function App() {
  const [tasks, setTasks] = useState<Task[]>([])
const [newTaskTitle, setNewTaskTitle] = useState("")
const addTask = async (e: FormEvent) => {
e.preventDefault()
try {
const newTask = await taskRepo.insert({ title: newTaskTitle })
setTasks([...tasks, newTask])
setNewTaskTitle("")
} catch (error: unknown) {
alert((error as { message: string }).message)
}
}
//...
```
- The call to `taskRepo.insert` makes a `POST` request to the server, inserts the new task into the database, and returns the new `Task` object with all its info (including the id generated by the database).
::: warning Import FormEvent
This code requires adding an import of `FormEvent` from `react`.
:::
Next, let's adjust the `tsx` to display a form for adding new tasks.
```tsx{6-13}
// src/App.tsx
return (
  <div>
    <h1>Todos</h1>
    <main>
      <form onSubmit={addTask}>
        <input
          value={newTaskTitle}
          placeholder="What needs to be done?"
          onChange={(e) => setNewTaskTitle(e.target.value)}
        />
        <button>Add</button>
      </form>
      {tasks.map((task) => {
        return <div key={task.id}>{task.title}</div>
      })}
    </main>
  </div>
)
```
Try adding a few tasks to see how it works
## Mark Tasks as completed
Modify the contents of the `tasks.map` iteration within the `App` component to include the following `setCompleted` function and call it in the input's `onChange` event.
```tsx{4-7,10-14}
// src/App.tsx
{
  tasks.map((task) => {
    const setTask = (value: Task) =>
      setTasks((tasks) => tasks.map((t) => (t === task ? value : t)))
    const setCompleted = async (completed: boolean) =>
      setTask(await taskRepo.save({ ...task, completed }))
    return (
      <div key={task.id}>
        <input
          type="checkbox"
          checked={task.completed}
          onChange={(e) => setCompleted(e.target.checked)}
        />
        {task.title}
      </div>
    )
  })
}
```
- The `setTask` function is used to replace the state of the changed task in the `tasks` array
- The `taskRepo.save` method sends the updated `task` to the server and returns the saved value.
## Rename Tasks and Save them
To make the tasks in the list updatable, we'll bind the `tasks` React state to `input` elements and add a _Save_ button to save the changes to the backend database.
Modify the contents of the `tasks.map` iteration within the `App` component to include the following `setTitle` and `saveTask` functions and add an `input` and a save `button`.
```tsx{8-15,23-27}
// src/App.tsx
{
  tasks.map((task) => {
    const setTask = (value: Task) =>
      setTasks((tasks) => tasks.map((t) => (t === task ? value : t)))
    const setCompleted = async (completed: boolean) =>
      setTask(await taskRepo.save({ ...task, completed }))
    const setTitle = (title: string) => setTask({ ...task, title })
    const saveTask = async () => {
      try {
        setTask(await taskRepo.save(task))
      } catch (error: unknown) {
        alert((error as { message: string }).message)
      }
    }
    return (
      <div key={task.id}>
        <input
          type="checkbox"
          checked={task.completed}
          onChange={(e) => setCompleted(e.target.checked)}
        />
        <input
          value={task.title}
          onChange={(e) => setTitle(e.target.value)}
        />
        <button onClick={saveTask}>Save</button>
      </div>
    )
  })
}
```
- The `setTitle` function, called from the `input`'s `onChange` event, saves the value from the `input` to the `tasks` state.
- The `saveTask` function, called from the `button`'s' `onClick`event, saves the `task` object to the backend.
Make some changes and refresh the browser to verify the backend database is updated.
::: tip Browser's Network tab
As you play with these `CRUD` capabilities, monitor the browser's network tab and see that they are all translated to REST API calls.
:::
## Delete Tasks
Let's add a _Delete_ button next to the _Save_ button of each task in the list.
Add the highlighted `deleteTask` function and _Delete_ `button` within the `tasks.map` iteration in the `return` section of the `App` component.
```tsx{16-23,36}
// src/App.tsx
{
  tasks.map((task) => {
    const setTask = (value: Task) =>
      setTasks((tasks) => tasks.map((t) => (t === task ? value : t)))
    const setCompleted = async (completed: boolean) =>
      setTask(await taskRepo.save({ ...task, completed }))
    const setTitle = (title: string) => setTask({ ...task, title })
    const saveTask = async () => {
      try {
        setTask(await taskRepo.save(task))
      } catch (error: unknown) {
        alert((error as { message: string }).message)
      }
    }
    const deleteTask = async () => {
      try {
        await taskRepo.delete(task)
        setTasks(tasks.filter((t) => t !== task))
      } catch (error: unknown) {
        alert((error as { message: string }).message)
      }
    }
    return (
      <div key={task.id}>
        <input
          type="checkbox"
          checked={task.completed}
          onChange={(e) => setCompleted(e.target.checked)}
        />
        <input
          value={task.title}
          onChange={(e) => setTitle(e.target.value)}
        />
        <button onClick={saveTask}>Save</button>
        <button onClick={deleteTask}>Delete</button>
      </div>
    )
  })
}
```
# react - Tutorial - Validation
# Validation
Validating user entered data is usually required both on the client-side and on the server-side, often causing a violation of the [DRY](https://en.wikipedia.org/wiki/Don%27t_repeat_yourself) design principle. **With Remult, validation code can be placed within the entity class, and Remult will run the validation logic on both the frontend and the relevant API requests.**
::: warning Handling validation errors
When a validation error occurs, Remult will throw an exception.
In this tutorial, [CRUD operations](crud.md) catch these exceptions, and alert the user.
We leave it to you to decide how to handle validation errors in your application.
:::
## Validate the Title Field
Task titles are required. Let's add a validity check for this rule.
1. In the `Task` entity class, modify the `Fields.string` decorator for the `title` field to include an object literal argument and set the object's `validate` property to `Validators.required`.
```ts{3-5}
// src/shared/Task.ts
@Fields.string({
validate: Validators.required
})
title = ""
```
::: warning Import Validators
This code requires adding an import of `Validators` from `remult`.
:::
::: warning Manual browser refresh required
For this change to take effect, you **must manually refresh the browser**.
:::
After the browser is refreshed, try creating a new `task` or saving an existing one with an empty title - the _"Should not be empty"_ error message is displayed.
### Implicit server-side validation
The validation code we've added is called by Remult on the server-side to validate any API calls attempting to modify the `title` field.
Try making the following `POST` http request to the `http://localhost:3002/api/tasks` API route, providing an invalid title.
```sh
curl -i http://localhost:3002/api/tasks -d "{\"title\": \"\"}" -H "Content-Type: application/json"
```
An HTTP error is returned, and the validation error text is included in the response body.
## Custom Validation
The `validate` property of the first argument of `Remult` field decorators can be set to an arrow function which will be called to validate input on both front-end and back-end.
Try something like this and see what happens:
```ts
// src/shared/Task.ts
@Fields.string({
validate: (task) => {
if (task.title.length < 3) throw "Too Short"
}
})
title = ""
```
::: warning Remove Validators Import
This code no longer requires the `Validators` import from `remult`, and it should be removed.
:::
# react - Tutorial - Live Queries
# Live Queries
Our todo list app can have multiple users using it at the same time. However, changes made by one user are not seen by others unless they manually refresh the browser.
Let's add realtime multiplayer capabilities to this app.
## Realtime updated todo list
Let's switch from fetching Tasks once when the React component is loaded, and manually maintaining state for CRUD operations, to using a realtime updated live query subscription **for both initial data fetching and subsequent state changes**.
1. Modify the contents of the `useEffect` hook in the `App` component:
```ts{4-5,10}
// src/App.tsx
useEffect(() => {
return taskRepo
.liveQuery({
limit: 20,
orderBy: { createdAt: "asc" }
//where: { completed: true },
})
.subscribe(info => setTasks(info.applyChanges))
}, [])
```
Let's review the change:
- Instead of calling the `repository`'s `find` method we now call the `liveQuery` method to define the query, and then call its `subscribe` method to establish a subscription which will update the Tasks state in realtime.
- The `subscribe` method accepts a callback with an `info` object that has 3 members:
- `items` - an up-to-date list of items representing the current result; useful for read-only use cases (see the sketch after this list).
- `applyChanges` - a method that receives an array and applies the changes to it - we send that method to the `setTasks` state function, to apply the changes to the existing `tasks` state.
- `changes` - a detailed list of changes that were received
- The `subscribe` method returns an `unsubscribe` function which we use as a return value for the `useEffect` hook, so that it'll be called when the component unmounts.
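If you only need a read-only view, `info.items` can be used directly instead of `applyChanges`; a minimal sketch with the same `taskRepo`:
```ts
// Read-only variant: replace the whole state with the current result set
useEffect(() => {
  return taskRepo
    .liveQuery({ limit: 20, orderBy: { createdAt: 'asc' } })
    .subscribe((info) => setTasks(info.items))
}, [])
```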
2. As all relevant CRUD operations (made by all users) will **immediately update the component's state**, we should remove the manual adding of new Tasks to the component's state:
```ts{7}
// src/App.tsx
const addTask = async (e: FormEvent) => {
e.preventDefault()
try {
// const newTask = await taskRepo.insert({ title: newTaskTitle }) <- Delete this line
await taskRepo.insert({ title: newTaskTitle }) // <- replace with this line
// setTasks([...tasks, newTask]) <-- this line is no longer needed
setNewTaskTitle("")
} catch (error: unknown) {
alert((error as { message: string }).message)
}
}
```
3. Optionally remove other redundant state changing code:
```tsx{11-12,18-19,28}
// src/App.tsx
//...
{
tasks.map(task => {
const setTask = (value: Task) =>
setTasks(tasks => tasks.map(t => (t === task ? value : t)))
const setCompleted = async (completed: boolean) =>
// setTask(await taskRepo.save({ ...task, completed })) <- Delete this line
await taskRepo.save({ ...task, completed }) // <- replace with this line
const setTitle = (title: string) => setTask({ ...task, title })
const saveTask = async () => {
try {
// setTask(await taskRepo.save(task)) <- Delete this line
await taskRepo.save(task) // <- replace with this line
} catch (error: unknown) {
alert((error as { message: string }).message)
}
}
const deleteTask = async () => {
try {
await taskRepo.delete(task)
// setTasks(tasks.filter(t => t !== task)) <- Delete this line
} catch (error: unknown) {
alert((error as { message: string }).message)
}
}
//...
})
}
```
Open the todo app in two (or more) browser windows/tabs, make some changes in one window and notice how the others are updated in realtime.
::: tip Under the hood
The default implementation of live-queries uses HTTP Server-Sent Events (SSE) to push realtime updates to clients, and stores live-query information in-memory.
For serverless environments _(or multi servers)_, live-query updates can be pushed using integration with third-party realtime providers, such as [Ably](https://ably.com/) (or others), and live-query information can be stored to any database supported by Remult.
:::
# react - Tutorial - Backend methods
# Backend methods
When performing operations on multiple entity objects, performance considerations may necessitate running them on the server. **With Remult, moving client-side logic to run on the server is a simple refactoring**.
## Set All Tasks as Un/completed
Let's add two buttons to the todo app: "Set all as completed" and "Set all as uncompleted".
1. Add a `setAllCompleted` async function to the `App` function component, which accepts a `completed` boolean argument and sets the value of the `completed` field of all the tasks accordingly.
```ts
// src/App.tsx
const setAllCompleted = async (completed: boolean) => {
for (const task of await taskRepo.find()) {
await taskRepo.save({ ...task, completed })
}
}
```
The `for` loop iterates the array of `Task` objects returned from the backend, and saves each task back to the backend with a modified value in the `completed` field.
2. Add the two buttons to the return section of the `App` component, just before the closing `</main>` tag. Both buttons' `onClick` events will call the `setAllCompleted` method with the appropriate value of the `completed` argument.
```tsx
// src/App.tsx
<button onClick={() => setAllCompleted(true)}>Set all as completed</button>
<button onClick={() => setAllCompleted(false)}>Set all as uncompleted</button>
```
Make sure the buttons are working as expected before moving on to the next step.
## Refactor from Front-end to Back-end
With the current state of the `setAllCompleted` function, each modified task being saved causes an API `PUT` request handled separately by the server. As the number of tasks in the todo list grows, this may become a performance issue.
A simple way to prevent this is to expose an API endpoint for `setAllCompleted` requests, and run the same logic on the server instead of the client.
1. Create a new `TasksController` class, in the `shared` folder, and refactor the `for` loop from the `setAllCompleted` function of the `App` function component into a new, `static`, `setAllCompleted` method in the `TasksController` class, which will run on the server.
```ts
// src/shared/TasksController.ts
import { BackendMethod, remult } from 'remult'
import { Task } from './Task.js'
export class TasksController {
@BackendMethod({ allowed: true })
static async setAllCompleted(completed: boolean) {
const taskRepo = remult.repo(Task)
for (const task of await taskRepo.find()) {
await taskRepo.save({ ...task, completed })
}
}
}
```
The `@BackendMethod` decorator tells Remult to expose the method as an API endpoint (the `allowed` property will be discussed later on in this tutorial).
**Unlike the front-end `Remult` object, the server implementation interacts directly with the database.**
2. Register `TasksController` by adding it to the `controllers` array of the `options` object passed to `remultExpress()`, in the server's `api` module:
```ts{4,8}
// src/server/api.ts
//...
import { TasksController } from "../shared/TasksController.js"
export const api = remultExpress({
//...
controllers: [TasksController]
})
```
3. Replace the `for` iteration in the `setAllCompleted` function of the `App` component with a call to the `setAllCompleted` method in the `TasksController`.
```tsx{4}
// src/App.tsx
const setAllCompleted = async (completed: boolean) => {
await TasksController.setAllCompleted(completed)
}
```
::: warning Import TasksController
Remember to add an import of `TasksController` in `App.tsx`.
:::
::: tip Note
With Remult backend methods, argument types are compile-time checked. :thumbsup:
:::
After the browser refreshes, the _"Set all..."_ buttons work exactly the same, but much faster.
# react - Tutorial - Authentication and Authorization
# Authentication and Authorization
Our todo app is nearly functionally complete, but it still doesn't fulfill a very basic requirement - that users should log in before they can view, create or modify tasks.
Remult provides a flexible mechanism that enables placing **code-based authorization rules** at various levels of the application's API. To maintain high code cohesion, **entity and field-level authorization code should be placed in entity classes**.
**Remult is completely unopinionated when it comes to user authentication.** You are free to use any kind of authentication mechanism, and only required to provide Remult with an object which implements the Remult `UserInfo` interface.
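For reference, such an object conforms to the `UserInfo` interface exported from `remult`; a minimal sketch:

```ts
import type { UserInfo } from 'remult'

const user: UserInfo = {
  id: '1', // required unique identifier
  name: 'Jane', // optional display name
  roles: ['admin'], // optional roles used for authorization
}
```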
In this tutorial, we'll use `Express`'s [cookie-session](https://expressjs.com/en/resources/middleware/cookie-session.html) middleware to store an authenticated user's session within a cookie. The `user` property of the session will be set by the API server upon a successful simplistic sign-in (based on username without password).
## Tasks CRUD Requires Sign-in
This rule is implemented within the `Task` `@Entity` decorator, by modifying the value of the `allowApiCrud` property.
This property can be set to a function that accepts a `Remult` argument and returns a `boolean` value. Let's use the `Allow.authenticated` function from Remult.
```ts{4}
// src/shared/Task.ts
@Entity("tasks", {
allowApiCrud: Allow.authenticated
})
```
::: warning Import Allow
This code requires adding an import of `Allow` from `remult`.
:::
After the browser refreshes, **the list of tasks disappears** and the user can no longer create new tasks.
::: details Inspect the HTTP error returned by the API using cURL
```sh
curl -i http://localhost:3002/api/tasks
```
:::
::: danger Authorized server-side code can still modify tasks
Although client CRUD requests to `tasks` API endpoints now require a signed-in user, the API endpoint created for our `setAllCompleted` server function remains available to unauthenticated requests. Since the `allowApiCrud` rule we implemented does not affect the server-side code's ability to use the `Task` entity class for performing database CRUD operations, **the `setAllCompleted` function still works as before**.
To fix this, let's implement the same rule using the `@BackendMethod` decorator of the `setAllCompleted` method of `TasksController`.
```ts
// src/shared/TasksController.ts
@BackendMethod({ allowed: Allow.authenticated })
```
**This code requires adding an import of `Allow` from `remult`.**
:::
## User Authentication
Let's add a sign-in area to the todo app, with an `input` for typing in a `username` and a sign-in `button`. The app will have two valid `username` values - _"Jane"_ and _"Steve"_. After a successful sign-in, the sign-in area will be replaced by a "Hi [username]" message.
### Backend setup
1. Open a terminal and run the following command to install the required packages:
```sh
npm i cookie-session
npm i --save-dev @types/cookie-session
```
2. Modify the main server module `index.ts` to use the `cookie-session` Express middleware.
```ts{5,8-12}
// src/server/index.ts
//...
import session from "cookie-session"
const app = express()
app.use(
session({
secret: process.env["SESSION_SECRET"] || "my secret"
})
)
//...
```
The `cookie-session` middleware stores session data, digitally signed using the value of the `secret` property, in an `httpOnly` cookie, sent by the browser to all subsequent API requests.
3. Add a `shared/AuthController.ts` file and include the following code:
```ts add={4-6,8-12}
// src/shared/AuthController.ts
import { BackendMethod, remult } from 'remult'
import type express from 'express'
// eslint-disable-next-line @typescript-eslint/no-unused-vars
import type from 'cookie-session' // required to access the session member of the request object
declare module 'remult' {
export interface RemultContext {
request?: express.Request
}
}
export class AuthController {
//
}
```
### Code Explanation
- We import the necessary modules from `remult` and types for `express` and `cookie-session`.
- We extend the `RemultContext` interface to include an optional `request` property of type `express.Request`.
- Remult will automatically set the `request` with the current request. Since Remult works with any server framework, we need to type it to the correct server, which in this case is Express. This typing gives us access to the request object and its session, managed by `cookie-session`.
- This `request` can be accessed using `remult.context.request`.
Next, we'll add a static list of users and a sign-in method. (In a real application, you would use a database, but for this tutorial, a static list will suffice.)
```ts add={1,4-17}
const validUsers = [{ name: 'Jane' }, { name: 'Steve' }]
export class AuthController {
@BackendMethod({ allowed: true })
static async signIn(name: string) {
const user = validUsers.find((user) => user.name === name)
if (user) {
remult.user = {
id: user.name,
name: user.name,
}
remult.context.request!.session!['user'] = remult.user
return remult.user
} else {
      throw Error("Invalid user, try 'Steve' or 'Jane'")
}
}
}
```
### Code Explanation
- We define a static list of valid users.
- The `signIn` method is decorated with `@BackendMethod({ allowed: true })`, making it accessible from the frontend.
- The method checks if the provided `name` exists in the `validUsers` list. If it does, it sets `remult.user` to an object that conforms to the `UserInfo` type from Remult and stores this user in the request session.
- If the user is not found, it throws an error.
Next, we'll add the sign-out method:
```ts add={7-11}
export class AuthController {
@BackendMethod({ allowed: true })
static async signIn(name: string) {
//...
}
@BackendMethod({ allowed: true })
static async signOut() {
remult.context.request!.session!['user'] = undefined
return undefined
}
}
```
- The `signOut` method clears the user session, making the user unauthenticated.
4. Update `remultExpress` configuration.
```ts{3,5,6}
// src/server/api.ts
import { AuthController } from '../shared/AuthController.js'
export const api = remultExpress({
//...
  controllers: [TasksController, AuthController],
getUser: (req) => req.session!['user'],
})
```
### Code Explanation
- Register the `AuthController` so that the frontend can call its `signIn` and `signOut` methods
- The `getUser` function is responsible for extracting the user information from the session. If a user is found in the session, Remult treats the request as authenticated, and this user is used for authorization purposes.
### Frontend setup
1. Create a file `src/Auth.tsx` and place the following `Auth` component code in it:
```tsx
// src/Auth.tsx
import { FormEvent, useEffect, useState } from "react";
import { remult } from "remult";
import App from "./App";
import { AuthController } from "./shared/AuthController";

export default function Auth() {
  const [username, setUsername] = useState("");
  const [signedIn, setSignedIn] = useState(false);

  async function signIn(e: FormEvent) {
    e.preventDefault();
    try {
      remult.user = await AuthController.signIn(username);
      setSignedIn(true);
    } catch (error: unknown) {
      alert((error as { message: string }).message);
    }
  }

  async function signOut() {
    await AuthController.signOut();
    remult.user = undefined;
    setSignedIn(false);
  }

  useEffect(() => {
    remult.initUser().then(() => {
      setSignedIn(remult.authenticated());
    });
  }, []);

  if (!signedIn)
    return (
      <>
        <h1>Todos</h1>
        <main>
          <form onSubmit={signIn}>
            <input
              value={username}
              onChange={(e) => setUsername(e.target.value)}
              placeholder="Username, try Steve or Jane"
            />
            <button>Sign in</button>
          </form>
        </main>
      </>
    );

  return (
    <>
      <header>
        Hello {remult.user!.name}
        <button onClick={signOut}>Sign Out</button>
      </header>
      <App />
    </>
  );
}
```
2. In the `main.tsx` file, replace the `App` component with the `Auth` component.
```tsx{4,9}
// src/main.tsx
import React from "react"
import ReactDOM from "react-dom/client"
import Auth from "./Auth"
import "./index.css"

ReactDOM.createRoot(document.getElementById("root") as HTMLElement).render(
  <React.StrictMode>
    <Auth />
  </React.StrictMode>
)
```
```
The todo app now supports signing in and out, with **all access restricted to signed in users only**.
## Role-based Authorization
Usually, not all application users have the same privileges. Let's define an `admin` role for our todo app, and enforce the following authorization rules:
- All signed in users can see the list of tasks.
- All signed in users can set specific tasks as `completed`.
- Only users belonging to the `admin` role can create, delete or edit the titles of tasks.
1. Modify the highlighted lines in the `Task` entity class to reflect the top three authorization rules.
```ts{5-6,15}
// src/shared/Task.ts
import { Allow, Entity, Fields } from "remult"
@Entity("tasks", {
  allowApiCrud: Allow.authenticated,
  allowApiInsert: "admin",
  allowApiDelete: "admin"
})
export class Task {
  @Fields.cuid()
  id = ''
  @Fields.string({
    validate: (task) => {
      if (task.title.length < 3) throw "Too Short"
    },
    allowApiUpdate: "admin"
  })
  title = ""
  @Fields.boolean()
  completed = false
  @Fields.createdAt()
  createdAt?: Date
}
```
2. Let's give the user _"Jane"_ the `admin` role by adding an `admin` flag to her `validUsers` entry and mapping it to the `roles` array in `signIn`.
```ts{3,13}
// src/shared/AuthController.ts
const validUsers = [{ name: "Jane", admin: true }, { name: "Steve" }];
export class AuthController {
@BackendMethod({ allowed: true })
static async signIn(name: string) {
const user = validUsers.find((user) => user.name === name);
if (user) {
remult.user = {
id: user.name,
name: user.name,
roles: user.admin ? ["admin"] : [],
};
remult.context.request!.session!["user"] = remult.user;
return remult.user;
} else {
throw Error("Invalid user, try 'Steve' or 'Jane'");
}
  }
  //...
}
```
**Sign in to the app as _"Steve"_ to test that the actions restricted to `admin` users are not allowed. :lock:**
## Role-based Authorization on the Frontend
From a user experience perspective, it only makes sense that users who can't add or delete tasks won't see those buttons.
Let's reuse the same definitions on the frontend.
We'll use the entity's metadata to only show the form if the user is allowed to insert:
```tsx{2,7}
// src/App.tsx
{taskRepo.metadata.apiInsertAllowed() && (
  <form onSubmit={addTask}>
    <input value={newTaskTitle} onChange={(e) => setNewTaskTitle(e.target.value)} />
    <button>Add</button>
  </form>
)}
...
```
And let's do the same for the `delete` button:
```tsx{3-5}
// src/App.tsx
{/* inside the task <div>, after the Save button */}
{taskRepo.metadata.apiDeleteAllowed(task) && (
  <button onClick={deleteTask}>Delete</button>
)}
```
This way we keep the frontend consistent with the API's authorization rules.
- Note: we pass the `task` to the `apiDeleteAllowed` method because the `apiDeleteAllowed` option can be sophisticated and may depend on the specific item's values.
# react - Tutorial - Database
# Database
Up until now the todo app has been using a plain JSON file to store the list of tasks. **In production, we'd like to use a `Postgres` database table instead.**
::: tip Learn more
See the [Quickstart](https://remult.dev/docs/quickstart.html#connecting-a-database) article for the (long) list of relational and non-relational databases Remult supports.
:::
::: warning Don't have Postgres installed? Don't have to.
Don't worry if you don't have Postgres installed locally. In the next step of the tutorial, we'll configure the app to use Postgres in production, and keep using JSON files in our dev environment.
**Simply install `postgres-node` per step 1 below and move on to the [Deployment section of the tutorial](deployment.md).**
:::
1. Install `postgres-node` ("pg").
```sh
npm i pg
```
2. Add the highlighted code to the `api` server module.
```ts{5,9-11}
// src/server/api.ts
//...
import { createPostgresDataProvider } from "remult/postgres"
export const api = remultExpress({
//...
dataProvider: createPostgresDataProvider({
connectionString: "your connection string"
})
})
```
# react - Tutorial - Deployment
# Deployment
Let's deploy the todo app to [railway.app](https://railway.app/).
## Prepare for Production
In this tutorial, we'll deploy both the React app and the API server as [one server-side app](https://create-react-app.dev/docs/deployment/#other-solutions), and redirect all non-API requests to return the React app.
We will deploy an ESM Node.js server project.
In addition, to follow a few basic production best practices, we'll use [compression](https://www.npmjs.com/package/compression) middleware to improve performance and [helmet](https://www.npmjs.com/package/helmet) middleware for security.
1. Add the highlighted code lines to `src/server/index.ts`, and modify the `app.listen` function's `port` argument to prefer a port number provided by the production host's `PORT` environment variable.
```ts{16-21}
// src/server/index.ts
import express from "express"
import { api } from "./api.js"
import session from "cookie-session"
import { auth } from "./auth.js"
const app = express()
app.use(
session({
secret: process.env["SESSION_SECRET"] || "my secret"
})
)
app.use(auth)
app.use(api)
const frontendFiles = process.cwd() + "/dist";
app.use(express.static(frontendFiles));
app.get("/*", (_, res) => {
res.sendFile(frontendFiles + "/index.html");
});
app.listen(process.env["PORT"] || 3002, () => console.log("Server started"));
```
2. Modify the highlighted code in the api server module to prefer a `connectionString` provided by the production host's `DATABASE_URL` environment variable.
```ts{4,7-9}
// src/server/api.ts
//...
const DATABASE_URL = process.env["DATABASE_URL"];
export const api = remultExpress({
dataProvider: DATABASE_URL
? createPostgresDataProvider({ connectionString: DATABASE_URL })
: undefined,
//...
})
```
::: warning Note
To connect to a local PostgreSQL database, add `DATABASE_URL` to an `.env` file, or simply replace `process.env["DATABASE_URL"]` with your `connectionString`.
If no `DATABASE_URL` is found, the app will fall back to the local JSON files.
:::
3. Modify the project's `build` npm script to additionally transpile the API server's TypeScript code to JavaScript (using `tsc`).
```json
// package.json
"build": "tsc && vite build && tsc -p tsconfig.server.json",
```
4. Modify the project's `start` npm script to start the production Node.js server.
```json
// package.json
"start": "node dist/server/"
```
The todo app is now ready for deployment to production.
## Test Locally
To test the application locally, run:
```sh
npm run build
npm run start
```
::: warning Build Errors
If you get the error `error TS5096: Option 'allowImportingTsExtensions' can only be used when either 'noEmit' or 'emitDeclarationOnly' is set.`, do not set the `emitDeclarationOnly` flag!
You are getting this error because somewhere in your code you've imported from a `.ts` file instead of `.js` - fix the import and build again.
:::
Now navigate to http://localhost:3002 and test the application locally.
## Deploy to Railway
In order to deploy the todo app to [railway](https://railway.app/) you'll need a `railway` account. You'll also need [Railway CLI](https://docs.railway.app/develop/cli#npm) installed, and you'll need to login to railway from the cli, using `railway login`.
Press Enter multiple times to answer all of its questions with the default answers.
1. Create a Railway `project`.
From the terminal in your project folder run:
```sh
railway init
```
2. Set a project name.
3. Once that's done run the following command to open the project on railway.dev:
```sh
railway open
```
4. Once that's done run the following command to upload the project to railway:
```sh
railway up
```
5. Add Postgres Database:
1. In the project on `railway.dev`, click `+ Create`
2. Select `Database`
3. Select `Add PostgreSQL`
6. Configure the environment variables
1. Click on the project card (not the Postgres one)
2. Switch to the `variables` tab
3. Click on `+ New Variable`, and in the `VARIABLE_NAME` field click `Add Reference` and select `DATABASE_URL`
4. Add another variable called `SESSION_SECRET` and set it to a random string; you can use an [online UUID generator](https://www.uuidgenerator.net/)
5. Switch to the `settings` tab
6. Under `Environment` click on `Generate Domain`
7. Click on the `Deploy` button on the top left.
7. Once the deployment is complete, click on the newly generated URL to open the app in the browser - you'll see the app live in production (it may take a few minutes to go live).
::: warning Note
If you run into trouble deploying the app to Railway, try using Railway's [documentation](https://docs.railway.app/deploy/deployments).
:::
That's it - our application is deployed to production, play with it and enjoy.
To see a larger, more complex code base, visit our [CRM example project](https://www.github.com/remult/crm-demo)
Love Remult? Give our repo a star.⭐
# angular - Tutorial - Setup
# Build a Full-Stack Angular Application
### Create a simple todo app with Remult using an Angular frontend
In this tutorial, we are going to create a simple app to manage a task list. We'll use `Angular` for the UI, `Node.js` + `Express.js` for the API server, and Remult as our full-stack CRUD framework. For deployment to production, we'll use [railway.app](https://railway.app/) and a `PostgreSQL` database.
By the end of the tutorial, you should have a basic understanding of Remult and how to use it to accelerate and simplify full stack app development.
::: tip Prefer React?
Check out the [React tutorial](../react/).
:::
### Prerequisites
This tutorial assumes you are familiar with `TypeScript` and `Angular`.
Before you begin, make sure you have [Node.js](https://nodejs.org) and [git](https://git-scm.com/) installed.
If the Angular CLI is not already installed, install it:
```sh
npm i -g @angular/cli
```
# Setup for the Tutorial
This tutorial requires setting up an Angular project, an API server project, and a few lines of code to add Remult.
You can either **use a starter project** to speed things up, or go through the **step-by-step setup**.
## Option 1: Clone the Starter Project
1. Clone the _angular-express-starter_ repository from GitHub and install its dependencies.
```sh
git clone https://github.com/remult/angular-express-starter.git remult-angular-todo
cd remult-angular-todo
npm install
```
2. Open your IDE.
3. Open a terminal and run the `dev` npm script.
```sh
npm run dev
```
The default Angular app main screen should be displayed.
At this point, our starter project is up and running. We are now ready to move to the [next step of the tutorial](./entities.md) and start creating the task list app.
## Option 2: Step-by-step Setup
### Create an Angular project
Create the new Angular project.
```sh
ng new remult-angular-todo
```
::: warning Note
The `ng new` command prompts you for information about features to include in the initial app project. Accept the defaults by pressing the Enter or Return key.
:::
In this tutorial, we'll be using the root folder created by `Angular` as the root folder for our server project as well.
```sh
cd remult-angular-todo
```
### Install required packages
We need `Express` to serve our app's API, and, of course, `Remult`. For development, we'll use [tsx](https://www.npmjs.com/package/tsx) to run the API server.
```sh
npm i express remult
npm i --save-dev @types/express tsx
```
### Create the API server project
The starter API server TypeScript project contains a single module that initializes `Express`, and begins listening for API requests.
1. Open your IDE.
2. Add the following entry to the `compilerOptions` section of the `tsconfig.json` file to enable the use of Synthetic Default Imports.
```json{7-8}
// tsconfig.json
{
...
"compilerOptions": {
...
"allowSyntheticDefaultImports": true,
...
}
...
}
```
3. Create a `server` folder under the `src/` folder created by the Angular CLI.
4. Create an `index.ts` file in the `src/server/` folder with the following code:
```ts
// src/server/index.ts
import express from 'express'
const app = express()
app.listen(3002, () => console.log('Server started'))
```
### Bootstrap Remult in the back-end
Remult is loaded in the back-end as an `Express middleware`.
1. Create an `api.ts` file in the `src/server/` folder with the following code:
```ts
// src/server/api.ts
import { remultExpress } from 'remult/remult-express'
export const api = remultExpress()
```
2. Add the highlighted code lines to register the middleware in the main server module `index.ts`.
```ts{4,7}
// src/server/index.ts
import express from "express"
import { api } from "./api"
const app = express()
app.use(api)
app.listen(3002, () => console.log("Server started"))
```
### Final tweaks
Our full stack starter project is almost ready. Let's complete these final configurations.
#### Proxy API requests from Angular DevServer to the API server
The Angular app created in this tutorial is intended to be served from the same domain as its API.
However, for development, the API server will be listening on `http://localhost:3002`, while the Angular dev server is served from the default `http://localhost:4200`.
We'll use the [proxy](https://angular.io/guide/build#proxying-to-a-backend-server) feature of Angular dev server to divert all calls for `http://localhost:4200/api` to our dev API server.
Create a file `proxy.conf.json` in the root folder, with the following contents:
```json
// proxy.conf.json
{
"/api": {
"target": "http://localhost:3002",
"secure": false
}
}
```
### Run the app
1. Add a script called `dev` that runs the Angular dev server with the proxy configuration we've set, and a script called `dev-node` to run the API server.
```json
// package.json
"scripts": {
...
"dev": "ng serve --proxy-config proxy.conf.json --open",
"dev-node": "tsx watch src/server",
...
}
```
2. Open a terminal and start the Angular dev server.
```sh
npm run dev
```
3. Open another terminal and start the `node` server
```sh
npm run dev-node
```
The server is now running and listening on port 3002. `tsx` is watching for file changes and will restart the server when code changes are saved.
The default Angular app main screen should be displayed on the regular port - 4200. Open it in the browser at [http://localhost:4200/](http://localhost:4200/).
### Remove Angular default styles
The Angular default styles won't fit our todo app. If you'd like a nice-looking app, replace the contents of `src/styles.css` with [this CSS file](https://raw.githubusercontent.com/remult/angular-express-starter/master/src/styles.css). Otherwise, you can simply **delete the contents of `src/styles.css`**.
### Setup completed
At this point, our starter project is up and running. We are now ready to move to the [next step of the tutorial](./entities.md) and start creating the task list app.
# angular - Tutorial - Entities
# Entities
Let's start coding the app by defining the `Task` entity class.
The `Task` entity class will be used:
- As a model class for client-side code
- As a model class for server-side code
- By `remult` to generate API endpoints, API queries, and database commands
The `Task` entity class we're creating will have an auto-generated `id` field, a `title` field, a `completed` field and an auto-generated `createdAt` field. The entity's API route ("tasks") will include endpoints for all `CRUD` operations.
## Define the Model
1. Create a `shared` folder under the `src` folder. This folder will contain code shared between frontend and backend.
2. Create a file `Task.ts` in the `src/shared/` folder, with the following code:
```ts
// src/shared/Task.ts
import { Entity, Fields } from 'remult'
@Entity('tasks', {
allowApiCrud: true,
})
export class Task {
@Fields.cuid()
id = ''
@Fields.string()
title = ''
@Fields.boolean()
completed = false
@Fields.createdAt()
createdAt?: Date
}
```
3. In the server's `api` module, register the `Task` entity with Remult by adding `entities: [Task]` to an `options` object you pass to the `remultExpress()` middleware:
```ts{4,7}
// src/server/api.ts
import { remultExpress } from "remult/remult-express"
import { Task } from "../shared/Task"
export const api = remultExpress({
entities: [Task]
})
```
The [@Entity](../../docs/ref_entity.md) decorator tells Remult this class is an entity class. The decorator accepts a `key` argument (used to name the API route and as a default database collection/table name), and an `options` argument used to define entity-related properties and operations, discussed in the next sections of this tutorial.
To initially allow all CRUD operations for tasks, we set the option [allowApiCrud](../../docs/ref_entity.md#allowapicrud) to `true`.
The [@Fields.cuid](../../docs/field-types.md#fields-cuid) decorator tells Remult to automatically generate a short random id using the [cuid](https://github.com/paralleldrive/cuid) library. This value can't be changed after the entity is created.
The [@Fields.string](../../docs/field-types.md#fields-string) decorator tells Remult the `title` property is an entity data field of type `String`. This decorator is also used to define field-related properties and operations, discussed in the next sections of this tutorial and the same goes for `@Fields.boolean` and the `completed` property.
The [@Fields.createdAt](../../docs/field-types.md#fields-createdat) decorator tells Remult to automatically generate a `createdAt` field with the current date and time.
::: tip
For a complete list of supported field types, see the [Field Types](../../docs/field-types.md) section in the Remult documentation.
:::
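For instance, a couple of other commonly used field types (a brief sketch, not part of this tutorial's `Task` entity):
```ts
// sketch only - additional field types, not used in this tutorial
import { Entity, Fields } from 'remult'

@Entity('products', { allowApiCrud: true })
export class Product {
  @Fields.cuid()
  id = ''

  @Fields.number() // a numeric field
  price = 0

  @Fields.json() // an arbitrary JSON-serialized value
  tags: string[] = []
}
```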
## Test the API
Now that the `Task` entity is defined, we can start using the REST API to query and add tasks.
1. Open a browser with the url: [http://localhost:3002/api/tasks](http://localhost:3002/api/tasks), and you'll see that you get an empty array.
2. Use `curl` to `POST` a new task - _Clean car_.
```sh
curl http://localhost:3002/api/tasks -d "{\"title\": \"Clean car\"}" -H "Content-Type: application/json"
```
3. Refresh the browser for the url: [http://localhost:3002/api/tasks](http://localhost:3002/api/tasks) and see that the array now contains one item.
4. Use `curl` to `POST` a few more tasks:
```sh
curl http://localhost:3002/api/tasks -d "[{\"title\": \"Read a book\"},{\"title\": \"Take a nap\", \"completed\":true },{\"title\": \"Pay bills\"},{\"title\": \"Do laundry\"}]" -H "Content-Type: application/json"
```
- Note that the `POST` endpoint can accept a single `Task` or an array of `Task`s.
5. Refresh the browser again, to see that the tasks were stored in the db.
::: warning Wait, where is the backend database?
While remult supports [many relational and non-relational databases](https://remult.dev/docs/databases.html), in this tutorial we start by storing entity data in a backend **JSON file**. Notice that a `db` folder has been created under the root folder, with a `tasks.json` file containing the created tasks.
:::
## Admin UI
### Enabling the Admin UI
Add the Admin UI to your Angular application by setting the `admin` option to `true` in the `remultExpress()` options:
::: code-group
```ts [src/server/api.ts]
import { remultExpress } from 'remult/remult-express'
import { Task } from '../shared/Task.js'
export const api = remultExpress({
entities: [Task],
admin: true, // Enable the Admin UI
})
```
:::
### Accessing and Using the Admin UI
Navigate to `http://localhost:3002/api/admin` to access the Admin UI. Here, you can perform CRUD operations on your entities, view their relationships via the Diagram entry, and ensure secure management with the same validations and authorizations as your application.
![Remult Admin](/remult-admin.png)
### Features
- **CRUD Operations**: Directly create, update, and delete tasks through the Admin UI.
- **Entity Diagram**: Visualize relationships between entities for better data structure understanding.
- **Security**: Operations are secure, adhering to application-defined rules.
## Display the Task List
Let's start developing the web app by displaying the list of existing tasks in an Angular component.
1. Create a `Todo` component using the Angular CLI:
```sh
ng g c todo
```
2. Import the new component in the `app.component.ts`
```ts{8,18}
//src/app/app.component.ts
import { Component } from '@angular/core';
import { CommonModule } from '@angular/common';
import { RouterOutlet } from '@angular/router';
import { HttpClientModule } from '@angular/common/http';
import { FormsModule } from '@angular/forms';
import { TodoComponent } from './todo/todo.component';
@Component({
selector: 'app-root',
standalone: true,
imports: [
CommonModule,
RouterOutlet,
HttpClientModule,
FormsModule,
TodoComponent,
],
templateUrl: './app.component.html',
styleUrl: './app.component.css',
})
export class AppComponent {
title = 'remult-angular-todo';
}
```
3. Replace the contents of `app.component.html` to use the `todo` component:
```html
<!-- src/app/app.component.html -->
<app-todo></app-todo>
```
4. Add the highlighted code lines to the `TodoComponent` class file:
```ts{5-7,12,17-21}
// src/app/todo/todo.component.ts
import { Component } from '@angular/core';
import { CommonModule } from '@angular/common';
import { FormsModule } from '@angular/forms';
import { remult } from 'remult';
import { Task } from '../../shared/Task';
@Component({
selector: 'app-todo',
standalone: true,
imports: [CommonModule, FormsModule],
templateUrl: './todo.component.html',
styleUrl: './todo.component.css',
})
export class TodoComponent {
taskRepo = remult.repo(Task);
tasks: Task[] = [];
ngOnInit() {
this.taskRepo.find().then((items) => (this.tasks = items));
}
}
```
Here's a quick overview of the different parts of the code snippet:
- We've imported the `FormsModule` for angular's forms support
- `taskRepo` is a Remult [Repository](../../docs/ref_repository.md) object used to fetch and create Task entity objects.
- `tasks` is a Task array.
- The `ngOnInit` method calls the Remult [repository](../../docs/ref_repository.md)'s [find](../../docs/ref_repository.md#find) method to fetch tasks from the server, once when the component is loaded.
5. Replace the contents of `todo.component.html` with the following HTML:
```html
todos
{{task.title}}
```
After the browser refreshes, the list of tasks appears.
# angular - Tutorial - Paging, Sorting and Filtering
# Paging, Sorting and Filtering
The RESTful API created by Remult supports **server-side paging, sorting, and filtering**. Let's use that to limit, sort and filter the list of tasks.
## Limit Number of Fetched Tasks
Since our database may eventually contain a lot of tasks, it makes sense to use a **paging strategy** to limit the number of tasks retrieved in a single fetch from the back-end database.
Let's limit the number of fetched tasks to `20`.
In the `ngOnInit` method, pass an `options` argument to the `find` method call and set its `limit` property to 20.
```ts{5}
// src/app/todo/todo.component.ts
ngOnInit() {
this.taskRepo.find({
limit: 20
}).then((items) => (this.tasks = items));
}
```
There aren't enough tasks in the database for this change to have an immediate effect, but it will have one later on when we add more tasks.
::: tip
To query subsequent pages, use the [Repository.find()](../../docs/ref_repository.md#find) method's `page` option.
:::
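For example, a sketch of fetching the second page of 20 tasks with that option:
```ts
// sketch: fetch tasks 21-40 (inside the component, as in the tutorial code)
const secondPage = await this.taskRepo.find({
  limit: 20,
  page: 2,
})
```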
## Sorting By Creation Date
We would like old tasks to appear first in the list, and new tasks to appear last. Let's sort the tasks by their `createdAt` field.
In the `ngOnInit` method, set the `orderBy` property of the `find` method call's `options` argument to an object that contains the fields you want to sort by.
Use "asc" and "desc" to determine the sort order.
```ts{6}
// src/app/todo/todo.component.ts
ngOnInit() {
this.taskRepo.find({
limit: 20,
orderBy: { createdAt:"asc" }
}).then((items) => (this.tasks = items));
}
```
## Server side Filtering
Remult supports sending filter rules to the server to query only the tasks that we need.
Adjust the `ngOnInit` method to fetch only `completed` tasks.
```ts{7}
// src/app/todo/todo.component.ts
ngOnInit() {
this.taskRepo.find({
limit: 20,
orderBy: { createdAt:"asc" },
where: { completed: true }
}).then((items) => (this.tasks = items));
}
```
::: warning Note
Because the `completed` field is of type `boolean`, the argument is **compile-time checked to be of the `boolean` type**. Setting the `completed` filter to `undefined` causes it to be ignored by Remult.
:::
Play with different filtering values, and eventually comment the filter out, since we do need all the tasks:
```ts{5}
ngOnInit() {
this.taskRepo.find({
limit: 20,
orderBy: { createdAt:"asc" },
//where: { completed: true }
}).then((items) => (this.tasks = items));
}
```
::: tip Learn more
Explore the reference for a [comprehensive list of filtering options](../../docs/entityFilter.md).
:::
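For instance, a sketch of a slightly richer filter using the `$contains` and `$gte` operators:
```ts
// sketch: combining several filter operators (not part of the tutorial code)
this.taskRepo.find({
  where: {
    title: { $contains: 'car' }, // title contains the text "car"
    createdAt: { $gte: new Date(2024, 0, 1) }, // created on or after Jan 1, 2024
  },
})
```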
# angular - Tutorial - CRUD Operations
# CRUD Operations
## Adding new tasks
Now that we can see the list of tasks, it's time to add a few more.
1. Add a `newTaskTitle` field and an `addTask` method to the `TodoComponent`
```ts{4-13}
// src/app/todo/todo.component.ts
export class TodoComponent implements OnInit {
newTaskTitle = ""
async addTask() {
try {
const newTask = await this.taskRepo.insert({ title: this.newTaskTitle })
this.tasks.push(newTask)
this.newTaskTitle = ""
} catch (error: any) {
alert(error.message)
}
}
}
```
- The call to `taskRepo.insert` will make a `POST` request to the server, insert the new task into the db, and return the new `Task` object with all its info (including the id generated by the database).
2. Add an _Add Task_ form in the html template:
```html{5-12}
todos
{{ task.title }}
```
Try adding a few tasks to see how it works.
## Rename Tasks and Mark as Completed
To make the tasks in the list updatable, we'll bind the `input` elements to the `Task` properties and add a _Save_ button to save the changes to the backend database.
1. Add a `saveTask` method to save the state of a task to the backend database
```ts
// src/app/todo/todo.component.ts
async saveTask(task: Task) {
try {
await this.taskRepo.save(task)
} catch (error: any) {
alert(error.message)
}
}
```
2. Modify the contents of the `tasks` div to include the following `input` elements and a _Save_ button to call the `saveTask` method.
```html{4-10}
```
Make some changes and refresh the browser to verify the backend database is updated.
## Delete Tasks
Let's add a _Delete_ button next to the _Save_ button of each task in the list.
1. Add the following `deleteTask` method to the `TodoComponent` class:
```ts
// src/app/todo/todo.component.ts
async deleteTask(task: Task) {
await this.taskRepo.delete(task);
this.tasks = this.tasks.filter(t => t !== task);
}
```
2. Add a _Delete_ button in the html:
```html{11}
```
# angular - Tutorial - Validation
# Validation
Validating user entered data is usually required both on the client-side and on the server-side, often causing a violation of the [DRY](https://en.wikipedia.org/wiki/Don%27t_repeat_yourself) design principle. **With Remult, validation code can be placed within the entity class, and Remult will run the validation logic on both the frontend and the relevant API requests.**
::: warning Handling validation errors
When a validation error occurs, Remult will throw an exception.
In this tutorial, [CRUD operations](crud.md) catch these exceptions, and alert the user.
We leave it to you to decide how to handle validation errors in your application.
:::
## Validate the Title Field
Task titles are required. Let's add a validity check for this rule.
1. In the `Task` entity class, modify the `Fields.string` decorator for the `title` field to include an object literal argument and set the object's `validate` property to `Validators.required`.
```ts{3-5}
// src/shared/Task.ts
@Fields.string({
validate: Validators.required
})
title = ""
```
::: warning Import Validators
This code requires adding an import of `Validators` from `remult`.
:::
::: warning Manual browser refresh required
For this change to take effect, you **must manually refresh the browser**.
:::
After the browser is refreshed, try creating a new `task` or saving an existing one with an empty title - the _"Should not be empty"_ error message is displayed.
### Implicit server-side validation
The validation code we've added is called by Remult on the server-side to validate any API calls attempting to modify the `title` field.
Try making the following `POST` http request to the `http://localhost:3002/api/tasks` API route, providing an invalid title.
```sh
curl -i http://localhost:3002/api/tasks -d "{\"title\": \"\"}" -H "Content-Type: application/json"
```
An HTTP error is returned, and the validation error text is included in the response body.
## Custom Validation
The `validate` property of the first argument of `Remult` field decorators can be set to an arrow function which will be called to validate input on both front-end and back-end.
Try something like this and see what happens:
```ts
// src/shared/Task.ts
@Fields.string({
validate: (task) => {
if (task.title.length < 3) throw "Too Short"
}
})
title = ""
```
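The `validate` option also accepts an array, so a built-in validator can be combined with a custom rule; a sketch, assuming the array form:
```ts
// src/shared/Task.ts (sketch) - combining Validators.required with a custom rule
@Fields.string({
  validate: [
    Validators.required,
    (task) => {
      if (task.title.length < 3) throw "Too Short"
    },
  ],
})
title = ""
```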
# angular - Tutorial - Live Queries
# Live Queries
Our todo list app can have multiple users using it at the same time. However, changes made by one user are not seen by others unless they manually refresh the browser.
Let's add realtime multiplayer capabilities to this app.
## One Time Setup
We'll need Angular to run its change detection when we receive messages from the backend - to do that, we'll add the following code to `AppComponent`:
```ts{3-5,7-9}
// src/app/app.component.ts
import { Component, NgZone } from '@angular/core';
import { remult } from "remult"
//...
export class AppComponent {
constructor(zone: NgZone) {
remult.apiClient.wrapMessageHandling = handler => zone.run(() => handler())
}
}
```
## Realtime updated todo list
Let's switch from fetching Tasks once when the Angular component is loaded, and manually maintaining state for CRUD operations, to using a realtime updated live query subscription **for both initial data fetching and subsequent state changes**.
1. Modify the `TodoComponent` with the following changes:
```ts{3,5,9,11-12,17,19-21}
// src/app/todo/todo.component.ts
import { Component, OnDestroy, OnInit } from '@angular/core';
...
export class TodoComponent implements OnInit, OnDestroy {
//...
taskRepo = remult.repo(Task)
tasks: Task[] = []
unsubscribe = () => {}
ngOnInit() {
this.unsubscribe = this.taskRepo
.liveQuery({
limit: 20,
orderBy: { createdAt: "asc" }
//where: { completed: true },
})
.subscribe(info => (this.tasks = info.applyChanges(this.tasks)))
}
ngOnDestroy() {
this.unsubscribe()
}
}
```
Let's review the change:
- Instead of calling the `repository`'s `find` method we now call the `liveQuery` method to define the query, and then call its `subscribe` method to establish a subscription which will update the Tasks state in realtime.
- The `subscribe` method accepts a callback with an `info` object that has 3 members:
- `items` - an up-to-date list of items representing the current result - useful for read-only use cases.
- `applyChanges` - a method that receives an array and applies the changes to it - we call it with the current `tasks` array and assign the result back to `tasks`.
- `changes` - a detailed list of changes that were received
- The `subscribe` method returns an `unsubscribe` method, which we store in the `unsubscribe` member and call in the `ngOnDestroy` hook, so that the subscription is closed when the component is destroyed.
2. As all relevant CRUD operations (made by all users) will **immediately update the component's state**, we should remove the manual adding of new Tasks to the component's state:
```ts{6}
// src/app/todo/todo.component.ts
async addTask() {
try {
const newTask = await this.taskRepo.insert({ title: this.newTaskTitle })
//this.tasks.push(newTask) <-- this line is no longer needed
this.newTaskTitle = ""
} catch (error: any) {
alert(error.message)
}
}
```
3. Optionally remove other redundant state changing code:
```ts{5}
// src/app/todo/todo.component.ts
async deleteTask(task: Task) {
await this.taskRepo.delete(task);
// this.tasks = this.tasks.filter(t => t !== task); <-- this line is no longer needed
}
```
Open the todo app in two (or more) browser windows/tabs, make some changes in one window and notice how the others are updated in realtime.
::: tip Under the hood
The default implementation of live-queries uses HTTP Server-Sent Events (SSE) to push realtime updates to clients, and stores live-query information in-memory.
For serverless environments _(or multi servers)_, live-query updates can be pushed using integration with third-party realtime providers, such as [Ably](https://ably.com/) (or others), and live-query information can be stored to any database supported by Remult.
:::
# angular - Tutorial - Backend methods
# Backend methods
When performing operations on multiple entity objects, performance considerations may necessitate running them on the server. **With Remult, moving client-side logic to run on the server is a simple refactoring**.
## Set All Tasks as Un/completed
Let's add two buttons to the todo app: "Set all as completed" and "Set all as uncompleted".
1. Add a `setAllCompleted` async method to the `TodoComponent` class, which accepts a `completed` boolean argument and sets the value of the `completed` field of all the tasks accordingly.
```ts
// src/app/todo/todo.component.ts
async setAllCompleted(completed: boolean) {
for (const task of await this.taskRepo.find()) {
await this.taskRepo.save({ ...task, completed });
}
}
```
The `for` loop iterates the array of `Task` objects returned from the backend, and saves each task back to the backend with a modified value in the `completed` field.
2. Add the two buttons to the `TodoComponent` just before the closing `` tag. Both of the buttons' `click` events will call the `setAllCompleted` method with the appropriate value of the `completed` argument.
```html
```
Make sure the buttons are working as expected before moving on to the next step.
## Refactor from Front-end to Back-end
With the current state of the `setAllCompleted` function, each modified task being saved causes an API `PUT` request handled separately by the server. As the number of tasks in the todo list grows, this may become a performance issue.
A simple way to prevent this is to expose an API endpoint for `setAllCompleted` requests, and run the same logic on the server instead of the client.
1. Create a new `TasksController` class in the `shared` folder, and refactor the `for` loop from the `setAllCompleted` method of the `TodoComponent` into a new `static` `setAllCompleted` method in the `TasksController` class, which will run on the server.
```ts
// src/shared/TasksController.ts
import { BackendMethod, remult } from 'remult'
import { Task } from './Task'
export class TasksController {
@BackendMethod({ allowed: true })
static async setAllCompleted(completed: boolean) {
const taskRepo = remult.repo(Task)
for (const task of await taskRepo.find()) {
await taskRepo.save({ ...task, completed })
}
}
}
```
The `@BackendMethod` decorator tells Remult to expose the method as an API endpoint (the `allowed` property will be discussed later on in this tutorial).
**Unlike the front-end `Remult` object, the server implementation interacts directly with the database.**
2. Register `TasksController` by adding it to the `controllers` array of the `options` object passed to `remultExpress()`, in the server's `api` module:
```ts{4,8}
// src/server/api.ts
//...
import { TasksController } from "../shared/TasksController"
export const api = remultExpress({
//...
controllers: [TasksController]
})
```
3. Replace the `for` iteration in the `setAllCompleted` method of the `TodoComponent` with a call to the `setAllCompleted` method of the `TasksController`.
```ts{4}
// src/app/todo/todo.component.ts
async setAllCompleted(completed: boolean) {
await TasksController.setAllCompleted(completed);
}
```
::: warning Import TasksController
Remember to add an import of `TasksController` in `todo.component.ts`.
:::
::: tip Note
With Remult backend methods, argument types are compile-time checked. :thumbsup:
:::
After the browser refreshes, the _"Set all..."_ buttons function exactly the same, but much faster.
# angular - Tutorial - Authentication and Authorization
# Authentication and Authorization
Our todo app is nearly functionally complete, but it still doesn't fulfill a very basic requirement - that users should log in before they can view, create or modify tasks.
Remult provides a flexible mechanism that enables placing **code-based authorization rules** at various levels of the application's API. To maintain high code cohesion, **entity and field-level authorization code should be placed in entity classes**.
**Remult is completely unopinionated when it comes to user authentication.** You are free to use any kind of authentication mechanism, and only required to provide Remult with an object which implements the Remult `UserInfo` interface.
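For reference, a `UserInfo` object has an `id`, an optional `name`, and an optional `roles` array; a minimal example:
```ts
import type { UserInfo } from 'remult'

// a minimal UserInfo object
const user: UserInfo = {
  id: '1',
  name: 'Jane',
  roles: ['admin'],
}
```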
In this tutorial, we'll use `Express`'s [cookie-session](https://expressjs.com/en/resources/middleware/cookie-session.html) middleware to store an authenticated user's session within a cookie. The `user` property of the session will be set by the API server upon a successful simplistic sign-in (based on username without password).
## Tasks CRUD Requires Sign-in
This rule is implemented within the `Task` `@Entity` decorator, by modifying the value of the `allowApiCrud` property.
This property can be set to a function that accepts a `Remult` argument and returns a `boolean` value. Let's use the `Allow.authenticated` function from Remult.
```ts{4}
// src/shared/Task.ts
@Entity("tasks", {
allowApiCrud: Allow.authenticated
})
```
::: warning Import Allow
This code requires adding an import of `Allow` from `remult`.
:::
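The same rule can also be expressed with the function form mentioned above; a minimal sketch:
```ts
// sketch: an equivalent rule written as a function (requires importing remult from "remult")
@Entity("tasks", {
  allowApiCrud: () => remult.authenticated()
})
export class Task {
  // ...fields unchanged
}
```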
After the browser refreshes, **the list of tasks disappears** and the user can no longer create new tasks.
::: details Inspect the HTTP error returned by the API using cURL
```sh
curl -i http://localhost:3002/api/tasks
```
:::
::: danger Authorized server-side code can still modify tasks
Although client CRUD requests to `tasks` API endpoints now require a signed-in user, the API endpoint created for our `setAllCompleted` server function remains available to unauthenticated requests. Since the `allowApiCrud` rule we implemented does not affect the server-side code's ability to use the `Task` entity class for performing database CRUD operations, **the `setAllCompleted` function still works as before**.
To fix this, let's implement the same rule using the `@BackendMethod` decorator of the `setAllCompleted` method of `TasksController`.
```ts
// src/shared/TasksController.ts
@BackendMethod({ allowed: Allow.authenticated })
```
**This code requires adding an import of `Allow` from `remult`.**
:::
## User Authentication
Let's add a sign-in area to the todo app, with an `input` for typing in a `username` and a sign-in `button`. The app will have two valid `username` values - _"Jane"_ and _"Steve"_. After a successful sign-in, the sign-in area will be replaced by a "Hi [username]" message.
### Backend setup
1. Open a terminal and run the following command to install the required packages:
```sh
npm i cookie-session
npm i --save-dev @types/cookie-session
```
2. Modify the main server module `index.ts` to use the `cookie-session` Express middleware.
```ts{5,8-12}
// src/server/index.ts
//...
import session from "cookie-session"
const app = express()
app.use(
session({
secret: process.env["SESSION_SECRET"] || "my secret"
})
)
//...
```
The `cookie-session` middleware stores session data, digitally signed using the value of the `secret` property, in an `httpOnly` cookie, sent by the browser to all subsequent API requests.
3. Add a `src/shared/AuthController.ts` file and include the following code:
```ts add={4-6,8-12}
// src/shared/AuthController.ts
import { BackendMethod, remult } from 'remult'
import type express from 'express'
// eslint-disable-next-line @typescript-eslint/no-unused-vars
import type from 'cookie-session' // required to access the session member of the request object
declare module 'remult' {
export interface RemultContext {
request?: express.Request
}
}
export class AuthController {
//
}
```
### Code Explanation
- We import the necessary modules from `remult` and types for `express` and `cookie-session`.
- We extend the `RemultContext` interface to include an optional `request` property of type `express.Request`.
- Remult will automatically set the `request` with the current request. Since Remult works with any server framework, we need to type it to the correct server, which in this case is Express. This typing gives us access to the request object and its session, managed by `cookie-session`.
- This `request` can be accessed using `remult.context.request`.
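As a small illustration (the controller and method names here are hypothetical), any backend method can now read data from the typed request:
```ts
// sketch: a hypothetical backend method inspecting the current request
import { BackendMethod, remult } from 'remult'

export class ExampleController {
  @BackendMethod({ allowed: true })
  static async whoAmI() {
    // request is typed as express.Request thanks to the module augmentation above
    return remult.context.request?.headers['user-agent']
  }
}
```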
Next, we'll add a static list of users and a sign-in method. (In a real application, you would use a database, but for this tutorial, a static list will suffice.)
```ts add={1,4-17}
const validUsers = [{ name: 'Jane' }, { name: 'Steve' }]
export class AuthController {
@BackendMethod({ allowed: true })
static async signIn(name: string) {
const user = validUsers.find((user) => user.name === name)
if (user) {
remult.user = {
id: user.name,
name: user.name,
}
remult.context.request!.session!['user'] = remult.user
return remult.user
} else {
throw Error("Invalid user, try 'Alex' or 'Jane'")
}
}
}
```
### Code Explanation
- We define a static list of valid users.
- The `signIn` method is decorated with `@BackendMethod({ allowed: true })`, making it accessible from the frontend.
- The method checks if the provided `name` exists in the `validUsers` list. If it does, it sets `remult.user` to an object that conforms to the `UserInfo` type from Remult and stores this user in the request session.
- If the user is not found, it throws an error.
Next, we'll add the sign-out method:
```ts add={7-11}
export class AuthController {
@BackendMethod({ allowed: true })
static async signIn(name: string) {
//...
}
@BackendMethod({ allowed: true })
static async signOut() {
remult.context.request!.session!['user'] = undefined
return undefined
}
}
```
- The `signOut` method clears the user session, making the user unauthenticated.
4. Update `remultExpress` configuration.
```ts{3,5,6}
// src/server/api.ts
import { AuthController } from '../shared/AuthController.js'
export const api = remultExpress({
//...
controllers: [TasksController, AuthController],
getUser: (req) => req.session!['user'],
})
```
### Code Explanation
- Register the `AuthController` so that the frontend can call its `signIn` and `signOut` methods
- The `getUser` function is responsible for extracting the user information from the session. If a user is found in the session, Remult will treat the request as authenticated, and this user will be used for authorization purposes.
### Frontend setup
1. Create an `Auth` component using the Angular CLI:
```sh
ng g c auth
```
2. Add the following code to the `AuthComponent` class file:
```ts
// src/app/auth/auth.component.ts
import { Component, OnInit } from '@angular/core'
import { CommonModule } from '@angular/common'
import { UserInfo, remult } from 'remult'
import { HttpClient, HttpClientModule } from '@angular/common/http'
import { FormsModule } from '@angular/forms'
import { TodoComponent } from '../todo/todo.component'
import { AuthController } from '../../shared/AuthController'
@Component({
selector: 'app-auth',
standalone: true,
imports: [CommonModule, FormsModule, TodoComponent, HttpClientModule],
templateUrl: './auth.component.html',
styleUrl: './auth.component.css',
})
export class AuthComponent implements OnInit {
signInUsername = ''
remult = remult
async signIn() {
try {
remult.user = await AuthController.signIn(this.signInUsername)
} catch (error: unknown) {
alert((error as { message: string }).message)
}
}
async signOut() {
await AuthController.signOut()
remult.user = undefined
}
ngOnInit() {
remult.initUser()
}
}
```
3. Replace the contents of `auth.component.html` with the following HTML:
```html
todos
Hello {{ remult.user?.name }}
```
4. Replace the `TodoComponent` with the `AuthComponent` in the `AppComponent`
```ts{6,12}
//src/app/app.component.ts
import { Component, NgZone } from '@angular/core';
import { CommonModule } from '@angular/common';
import { RouterOutlet } from '@angular/router';
import { AuthComponent } from './auth/auth.component';
import { remult } from 'remult';
@Component({
selector: 'app-root',
standalone: true,
imports: [CommonModule, RouterOutlet, AuthComponent],
templateUrl: './app.component.html',
styleUrl: './app.component.css',
})
```
5. Change the `app.component.html` to use the `AuthComponent` instead of the `TodoComponent`:
```html
<!-- src/app/app.component.html -->
<app-auth></app-auth>
```
The todo app now supports signing in and out, with **all access restricted to signed in users only**.
## Role-based Authorization
Usually, not all application users have the same privileges. Let's define an `admin` role for our todo app, and enforce the following authorization rules:
- All signed in users can see the list of tasks.
- All signed in users can set specific tasks as `completed`.
- Only users belonging to the `admin` role can create, delete or edit the titles of tasks.
1. Modify the highlighted lines in the `Task` entity class to reflect the top three authorization rules.
```ts{7-8,18}
// src/shared/Task.ts
import { Allow, Entity, Fields, Validators } from "remult"
@Entity("tasks", {
allowApiCrud: Allow.authenticated,
allowApiInsert: "admin",
allowApiDelete: "admin"
})
export class Task {
@Fields.uuid()
id!: string
@Fields.string({
validate: (task) => {
if (task.title.length < 3) throw "Too Short"
},
allowApiUpdate: "admin"
})
title = ""
@Fields.boolean()
completed = false
}
```
2. Let's give the user _"Jane"_ the `admin` role by adding an `admin` flag to her `validUsers` entry and assigning the `admin` role when she signs in.
```ts{3,13}
// src/shared/AuthController.ts
const validUsers = [{ name: "Jane", admin: true }, { name: "Steve" }];
export class AuthController {
@BackendMethod({ allowed: true })
static async signIn(name: string) {
const user = validUsers.find((user) => user.name === name);
if (user) {
remult.user = {
id: user.name,
name: user.name,
roles: user.admin ? ["admin"] : [],
};
remult.context.request!.session!["user"] = remult.user;
return remult.user;
} else {
throw Error("Invalid user, try 'Steve' or 'Jane'");
}
}
```
**Sign in to the app as _"Steve"_ to test that the actions restricted to `admin` users are not allowed. :lock:**
## Role-based Authorization on the Frontend
From a user experience perspective, it only makes sense that users who can't add or delete tasks shouldn't see these buttons.
Let's reuse the same definitions on the Frontend.
Modify the contents of `todo.component.html` to display the form and the delete buttons only if these operations are allowed, based on the entity's metadata:
```html{5,22}
todos
```
This way we can keep the frontend consistent with the `api`'s authorization rules.
- Note: we pass the `task` to the `apiDeleteAllowed` method because the `apiDeleteAllowed` option can be sophisticated and may also depend on the specific item's values.
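In the component class, the same metadata checks can be exposed as helper methods for the template's `*ngIf` expressions; a sketch (the helper names are illustrative):
```ts
// src/app/todo/todo.component.ts (sketch) - illustrative helpers for the template
isInsertAllowed() {
  return this.taskRepo.metadata.apiInsertAllowed()
}

// delete rules can depend on the specific task's values, so we pass the task
isDeleteAllowed(task: Task) {
  return this.taskRepo.metadata.apiDeleteAllowed(task)
}
```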
# angular - Tutorial - Database
# Database
Up until now the todo app has been using a plain JSON file to store the list of tasks. **In production, we'd like to use a `Postgres` database table instead.**
::: tip Learn more
See the [Quickstart](https://remult.dev/docs/quickstart.html#connecting-a-database) article for the (long) list of relational and non-relational databases Remult supports.
:::
::: warning Don't have Postgres installed? Don't have to.
Don't worry if you don't have Postgres installed locally. In the next step of the tutorial, we'll configure the app to use Postgres in production, and keep using JSON files in our dev environment.
**Simply install `postgres-node` per step 1 below and move on to the [Deployment section of the tutorial](deployment.md).**
:::
1. Install `postgres-node` ("pg").
```sh
npm i pg
```
2. Add the highlighted code to the `api` server module.
```ts{5,9-11}
// src/server/api.ts
//...
import { createPostgresDataProvider } from "remult/postgres"
export const api = remultExpress({
//...
dataProvider: createPostgresDataProvider({
connectionString: "your connection string"
})
})
```
# angular - Tutorial - Deployment
# Deployment
Let's deploy the todo app to [railway.app](https://railway.app/).
## Prepare for Production
In this tutorial, we'll deploy both the Angular app and the API server as [one server-side app](https://create-react-app.dev/docs/deployment/#other-solutions), and redirect all non-API requests to return the Angular app.
In addition, to follow a few basic production best practices, we'll use [compression](https://www.npmjs.com/package/compression) middleware to improve performance and [helmet](https://www.npmjs.com/package/helmet) middleware for security.
1. Add the highlighted code lines to `src/server/index.ts`, and modify the `app.listen` function's `port` argument to prefer a port number provided by the production host's `PORT` environment variable.
```ts{16-21}
// src/server/index.ts
import express from "express"
import { api } from "./api.js"
import session from "cookie-session"
import { auth } from "./auth.js"
const app = express()
app.use(
session({
secret: process.env["SESSION_SECRET"] || "my secret"
})
)
app.use(auth)
app.use(api)
const frontendFiles = process.cwd() + "/dist/remult-angular-todo/browser";
app.use(express.static(frontendFiles));
app.get("/*", (_, res) => {
res.sendFile(frontendFiles + "/index.html");
});
app.listen(process.env["PORT"] || 3002, () => console.log("Server started"));
```
::: warning Angular versions <17
If you're using Angular version 16 or below, the build output path is `/dist/remult-angular-todo` (without the `/browser` suffix).
:::
2. Modify the highlighted code in the api server module to prefer a `connectionString` provided by the production host's `DATABASE_URL` environment variable.
```ts{4,7-9}
// src/server/api.ts
//...
const DATABASE_URL = process.env["DATABASE_URL"];
export const api = remultExpress({
dataProvider: DATABASE_URL
? createPostgresDataProvider({ connectionString: DATABASE_URL })
: undefined,
//...
})
```
::: warning Note
To connect to a local PostgreSQL database, add `DATABASE_URL` to an `.env` file, or simply replace `process.env["DATABASE_URL"]` with your `connectionString`.
If no `DATABASE_URL` is found, the app will fall back to the local JSON files.
:::
3. In the root folder, create a TypeScript configuration file `tsconfig.server.json` for building the server project.
```json
// tsconfig.server.json
{
"extends": "./tsconfig.json",
"compilerOptions": {
"module": "commonjs",
"esModuleInterop": true,
"noEmit": false,
"outDir": "dist",
"skipLibCheck": true,
"rootDir": "src"
},
"include": ["src/server/index.ts"]
}
```
4. Modify the project's `build` npm script to additionally transpile the API server's TypeScript code to JavaScript (using `tsc`).
```json
// package.json
"build": "ng build && tsc -p tsconfig.server.json"
```
5. Modify the project's `start` npm script to start the production Node.js server.
```json
// package.json
"start": "node dist/server/"
```
The todo app is now ready for deployment to production.
## Test Locally
To test the application locally, run:
```sh
npm run build
npm run start
```
Now navigate to http://localhost:3002 and test the application locally.
## Deploy to Railway
In order to deploy the todo app to [railway](https://railway.app/) you'll need a `railway` account. You'll also need [Railway CLI](https://docs.railway.app/develop/cli#npm) installed, and you'll need to login to railway from the cli, using `railway login`.
Press Enter multiple times to answer all of its questions with the default answers.
1. Create a Railway `project`.
From the terminal in your project folder run:
```sh
railway init
```
2. Set a project name.
3. Once that's done run the following command to open the project on railway.dev:
```sh
railway open
```
4. Once that's done run the following command to upload the project to railway:
```sh
railway up
```
5. Add Postgres Database:
1. In the project on `railway.dev`, click `+ Create`
2. Select `Database`
3. Select `Add PostgreSQL`
6. Configure the environment variables
1. Click on the project card (not the Postgres one)
2. Switch to the `variables` tab
3. Click on `+ New Variable`, and in the `VARIABLE_NAME` field click `Add Reference` and select `DATABASE_URL`
4. Add another variable called `SESSION_SECRET` and set it to a random string; you can use an [online UUID generator](https://www.uuidgenerator.net/)
5. Switch to the `settings` tab
6. Under `Environment` click on `Generate Domain`
7. Click on the `Deploy` button on the top left.
7. Once the deployment is complete, click on the newly generated URL to open the app in the browser - you'll see the app live in production (it may take a few minutes to go live).
::: warning Note
If you run into trouble deploying the app to Railway, try using Railway's [documentation](https://docs.railway.app/deploy/deployments).
:::
That's it - our application is deployed to production, play with it and enjoy.
Love Remult? Give our repo a star.⭐
# angular - Tutorial - Appendix: Observable Live Query
# Appendix - Observable Live Query
To use `liveQuery` as an observable, add the following utility function to your code:
```ts
// src/app/from-live-query.ts
import { LiveQuery } from 'remult'
import { Observable } from 'rxjs'
export function fromLiveQuery<T>(q: LiveQuery<T>) {
return new Observable<T[]>((sub) =>
q.subscribe(({ items }) => sub.next(items)),
)
}
```
1. Adjust the `TodoComponent`
```ts{4,6-11}
// src/app/todo/todo.component.ts
...
export class TodoComponent {
taskRepo = remult.repo(Task);
tasks$ = fromLiveQuery(
this.taskRepo.liveQuery({
limit: 20,
orderBy: { createdAt: 'asc' },
})
);
```
Note that we've removed `ngOnInit` and `ngOnDestroy`, as they are no longer needed.
2. Adjust the `todo.component.html`
```html{3}
```
# vue - Tutorial - Setup
# Build a Full-Stack Vue Application
### Create a simple todo app with Remult using a Vue frontend
In this tutorial, we are going to create a simple app to manage a task list. We'll use `Vue` for the UI, `Node.js` + `Express.js` for the API server, and Remult as our full-stack CRUD framework. For deployment to production, we'll use [railway.app](https://railway.app/) and a `PostgreSQL` database.
By the end of the tutorial, you should have a basic understanding of Remult and how to use it to accelerate and simplify full stack app development.
::: tip Prefer React?
Check out the [React tutorial](../react/).
:::
### Prerequisites
This tutorial assumes you are familiar with `TypeScript` and `Vue`.
Before you begin, make sure you have [Node.js](https://nodejs.org) and [git](https://git-scm.com/) installed.
# Setup for the Tutorial
This tutorial requires setting up a Vue project, an API server project, and a few lines of code to add Remult.
You can either **use a starter project** to speed things up, or go through the **step-by-step setup**.
## Option 1: Clone the Starter Project
1. Clone the _vue-express-starter_ repository from GitHub and install its dependencies.
```sh
git clone https://github.com/remult/vue-express-starter.git remult-vue-todo
cd remult-vue-todo
npm install
```
2. Open your IDE.
3. Open a terminal and run the `dev` npm script.
```sh
npm run dev
```
4. Open another terminal and run the `dev-node` npm script
```sh
npm run dev-node
```
The default "Vue" app main screen should be available at the default Vite dev server address [http://127.0.0.1:5173](http://127.0.0.1:5173).
At this point, our starter project is up and running. We are now ready to move to the [next step of the tutorial](./entities.md) and start creating the task list app.
## Option 2: Step-by-step Setup
### Create a Vue project
Create the new Vue project.
```sh
npm init -y vue@latest
```
The command prompts you for information about features to include in the initial app project. Here are the answers used in this tutorial:
1. Project name: ... **remult-vue-todo**
2. Add Typescript? ... **Yes**
3. For the rest of the answers, simply select the default.
::: warning Run into issues scaffolding the Vite project?
See [Vite documentation](https://vitejs.dev/guide/#scaffolding-your-first-vite-project) for help.
:::
Once completed, run:
```sh
cd remult-vue-todo
```
In this tutorial, we'll be using the root folder created by `Vue` as the root folder for our server project as well.
### Install required packages
We need `Express` to serve our app's API, and, of course, `Remult`. For development, we'll use [tsx](https://www.npmjs.com/package/tsx) to run the API server.
```sh
npm i express remult
npm i --save-dev @types/express tsx
```
### Create the API server project
The starter API server TypeScript project contains a single module that initializes `Express`, and begins listening for API requests.
1. Open your IDE.
2. Create a `server` folder under the `src/` folder created by Vite.
3. Create an `index.ts` file in the `src/server/` folder with the following code:
```ts
// src/server/index.ts
import express from 'express'
const app = express()
app.listen(3002, () => console.log('Server started'))
```
### Bootstrap Remult in the back-end
Remult is loaded in the back-end as an `Express middleware`.
1. Create an `api.ts` file in the `src/server/` folder with the following code:
```ts
// src/server/api.ts
import { remultExpress } from 'remult/remult-express'
export const api = remultExpress()
```
2. Add the highlighted code lines to register the middleware in the main server module `index.ts`.
```ts{4,7}
// src/server/index.ts
import express from "express"
import { api } from "./api.js"
const app = express()
app.use(api)
app.listen(3002, () => console.log("Server started"))
```
::: warning ESM
In this tutorial we will be using `esm` for the Node.js server - that means that wherever we import a file we have to include the `.js` suffix, as we did above in the `import { api } from "./api.js"` statement.
:::
### Final tweaks
Our full stack starter project is almost ready. Let's complete these final configurations.
#### Enable TypeScript decorators in Vite
Add the following entry to the `defineConfig` section of the `vite.config.ts` file to enable the use of decorators in the Vue app.
```ts{6-12}
// vite.config.ts
// ...
export default defineConfig({
plugins: [vue()],
esbuild: {
tsconfigRaw: {
compilerOptions: {
experimentalDecorators: true,
},
},
},
});
```
#### Create the server tsconfig file
In the root folder, create a TypeScript configuration file `tsconfig.server.json` for the server project.
```json
{
"compilerOptions": {
"experimentalDecorators": true,
"skipLibCheck": true,
"esModuleInterop": true,
"outDir": "dist",
"rootDir": "src",
"module": "nodenext"
},
"include": ["src/server/**/*", "src/shared/**/*"]
}
```
#### Proxy API requests from Vue DevServer (vite) to the API server
The Vue app created in this tutorial is intended to be served from the same domain as its API.
However, for development, the API server will be listening on `http://localhost:3002`, while the Vue app is served from the default `http://localhost:5173`.
We'll use the [proxy](https://vitejs.dev/config/#server-proxy) feature of Vite to divert all calls for `http://localhost:5173/api` to our dev API server.
Configure the proxy by adding the following entry to the `vite.config.ts` file:
```ts{6}
// vite.config.ts
//...
export default defineConfig({
plugins: [vue()],
server: { proxy: { "/api": "http://localhost:3002" } },
esbuild: {
tsconfigRaw: {
compilerOptions: {
experimentalDecorators: true,
},
},
},
});
```
### Run the app
1. Open a terminal and start the vite dev server.
```sh
npm run dev
```
2. Add an `npm` script named `dev-node` to start the dev API server in the `package.json`.
```json
// package.json
"dev-node": "tsx watch --tsconfig tsconfig.server.json src/server"
```
3. Open another terminal and start the `node` server
```sh
npm run dev-node
```
The server is now running and listening on port 3002. `tsx` is watching for file changes and will restart the server when code changes are saved.
The default "Vue" app main screen should be available at the default Vite dev server address [http://127.0.0.1:5173](http://127.0.0.1:5173).
### Remove Vue default styles
The Vue default styles won't fit our todo app. If you'd like a nice-looking app, replace the contents of `src/assets/main.css` with [this CSS file](https://raw.githubusercontent.com/remult/vue-express-starter/master/src/assets/main.css). Otherwise, you can simply **delete the contents of `src/assets/main.css`**.
### Setup completed
At this point, our starter project is up and running. We are now ready to move to the [next step of the tutorial](./entities.md) and start creating the task list app.
# vue - Tutorial - Entities
# Entities
Let's start coding the app by defining the `Task` entity class.
The `Task` entity class will be used:
- As a model class for client-side code
- As a model class for server-side code
- By `remult` to generate API endpoints, API queries, and database commands
The `Task` entity class we're creating will have an auto-generated `id` field, a `title` field, a `completed` field and an auto-generated `createdAt` field. The entity's API route ("tasks") will include endpoints for all `CRUD` operations.
## Define the Model
1. Create a `shared` folder under the `src` folder. This folder will contain code shared between frontend and backend.
2. Create a file `Task.ts` in the `src/shared/` folder, with the following code:
```ts
// src/shared/Task.ts
import { Entity, Fields } from "remult"
@Entity("tasks", {
allowApiCrud: true
})
export class Task {
@Fields.cuid()
id = ""
@Fields.string()
title = ""
@Fields.boolean()
completed = false
@Fields.createdAt()
createdAt?: Date
}
```
3. In the server's `api` module, register the `Task` entity with Remult by adding `entities: [Task]` to an `options` object you pass to the `remultExpress()` middleware:
```ts{4,7}
// src/server/api.ts

import { remultExpress } from "remult/remult-express"
import { Task } from "../shared/Task.js"

export const api = remultExpress({
entities: [Task]
})
```
::: warning ESM
In this tutorial we use `esm` for the Node.js server. That means that wherever we import a file we have to include the `.js` suffix, as we did above in the `import { Task } from "../shared/Task.js"` statement.
:::
The [@Entity](../../docs/ref_entity.md) decorator tells Remult this class is an entity class. The decorator accepts a `key` argument (used to name the API route and as a default database collection/table name), and an `options` argument used to define entity-related properties and operations, discussed in the next sections of this tutorial.
To initially allow all CRUD operations for tasks, we set the option [allowApiCrud](../../docs/ref_entity.md#allowapicrud) to `true`.
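As an aside (not part of the tutorial code), `allowApiCrud` is a shorthand; the entity options also accept per-operation rules, so a sketch with more granular permissions could look like this:

```ts
// Illustration only - the tutorial keeps allowApiCrud: true.
import { Entity, Fields } from "remult"

@Entity("tasks", {
  allowApiRead: true, // allow GET requests
  allowApiInsert: true, // allow POST requests
  allowApiUpdate: false, // block updates through the API
  allowApiDelete: false, // block deletes through the API
})
export class Task {
  @Fields.cuid()
  id = ""

  @Fields.string()
  title = ""
}
```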
The [@Fields.cuid](../../docs/field-types.md#fields-cuid) decorator tells Remult to automatically generate a short random id using the [cuid](https://github.com/paralleldrive/cuid) library. This value can't be changed after the entity is created.
The [@Fields.string](../../docs/field-types.md#fields-string) decorator tells Remult the `title` property is an entity data field of type `String`. This decorator is also used to define field-related properties and operations, discussed in the next sections of this tutorial. The same goes for `@Fields.boolean` and the `completed` property.
The [@Fields.createdAt](../../docs/field-types.md#fields-createdat) decorator tells Remult to automatically generate a `createdAt` field with the current date and time.
::: tip
For a complete list of supported field types, see the [Field Types](../../docs/field-types.md) section in the Remult documentation.
:::
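For example, here is a quick sketch (not used in this tutorial) of alternative auto-generated id field types covered by the Field Types documentation:

```ts
// Illustration only: other id strategies supported by Remult.
import { Entity, Fields } from "remult"

@Entity("examples", { allowApiCrud: true })
export class Example {
  // A standard UUID instead of a cuid:
  @Fields.uuid()
  id = ""

  // Or, for databases that support it, a numeric auto-incrementing id:
  // @Fields.autoIncrement()
  // id = 0
}
```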
## Test the API
Now that the `Task` entity is defined, we can start using the REST API to query and add tasks.
1. Open a browser with the url: [http://localhost:3002/api/tasks](http://localhost:3002/api/tasks), and you'll see that you get an empty array.
2. Use `curl` to `POST` a new task - *Clean car*.
```sh
curl http://localhost:3002/api/tasks -d "{\"title\": \"Clean car\"}" -H "Content-Type: application/json"
```
3. Refresh the browser for the url: [http://localhost:3002/api/tasks](http://localhost:3002/api/tasks) and see that the array now contains one item.
4. Use `curl` to `POST` a few more tasks:
```sh
curl http://localhost:3002/api/tasks -d "[{\"title\": \"Read a book\"},{\"title\": \"Take a nap\", \"completed\":true },{\"title\": \"Pay bills\"},{\"title\": \"Do laundry\"}]" -H "Content-Type: application/json"
```
- Note that the `POST` endpoint can accept a single `Task` or an array of `Task`s.
5. Refresh the browser again to see that the tasks were stored in the database.
::: warning Wait, where is the backend database?
While Remult supports [many relational and non-relational databases](https://remult.dev/docs/databases.html), in this tutorial we start by storing entity data in a backend **JSON file**. Notice that a `db` folder has been created under the root folder, with a `tasks.json` file containing the created tasks.
:::
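If you'd rather point Remult at a real database right away, a minimal sketch using Postgres could look like the following (this assumes the `pg` package is installed, and the connection string is only a placeholder; see the databases documentation for other options):

```ts
// src/server/api.ts - sketch only; the tutorial keeps the default JSON file storage.
import { remultExpress } from "remult/remult-express"
import { createPostgresDataProvider } from "remult/postgres"
import { Task } from "../shared/Task.js"

export const api = remultExpress({
  entities: [Task],
  // Replace the placeholder connection string with your own.
  dataProvider: createPostgresDataProvider({
    connectionString: "postgres://user:password@localhost/dbname",
  }),
})
```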
## Admin UI
### Enabling the Admin UI
Add the Admin UI to your Vue application by setting the `admin` option to `true` in the `remultExpress()` options:
::: code-group
```ts [src/server/api.ts]
import { remultExpress } from 'remult/remult-express'
import { Task } from '../shared/Task.js'
export const api = remultExpress({
entities: [Task],
admin: true, // Enable the Admin UI
})
```
:::
### Accessing and Using the Admin UI
Navigate to `http://localhost:5173/api/admin` to access the Admin UI. Here, you can perform CRUD operations on your entities, view their relationships via the Diagram entry, and ensure secure management with the same validations and authorizations as your application.
![Remult Admin](/remult-admin.png)
### Features
- **CRUD Operations**: Directly create, update, and delete tasks through the Admin UI.
- **Entity Diagram**: Visualize relationships between entities for better data structure understanding.
- **Security**: Operations are secure, adhering to application-defined rules.
## Display the Task List
Let's start developing the web app by displaying the list of existing tasks in a Vue component.
Replace the contents of `src/App.vue` with the following code:
```vue
<!-- src/App.vue -->
<script setup lang="ts">
import { onMounted, ref } from "vue"
import { remult } from "remult"
import { Task } from "./shared/Task"

const taskRepo = remult.repo(Task)
const tasks = ref<Task[]>([])

onMounted(() => taskRepo.find().then(items => (tasks.value = items)))
</script>

<template>
  <div>
    <h1>todos</h1>
    <main>
      <div v-for="task in tasks">
        {{ task.title }}
      </div>
    </main>
  </div>
</template>
```
Here's a quick overview of the different parts of the code snippet:
- `taskRepo` is a Remult [Repository](../../docs/ref_repository.md) object used to fetch and create Task entity objects.
- `tasks` is a Task array Vue `ref` that holds the list of tasks.
- Vue's `onMounted` hook is used to call the Remult [repository](../../docs/ref_repository.md)'s [find](../../docs/ref_repository.md#find) method to fetch tasks from the server, once when the Vue component is loaded.
After the browser refreshes, the list of tasks appears.
# vue - Tutorial - Paging, Sorting and Filtering
# Paging, Sorting and Filtering
The RESTful API created by Remult supports **server-side paging, sorting, and filtering**. Let's use that to limit, sort and filter the list of tasks.
## Limit Number of Fetched Tasks
Since our database may eventually contain a lot of tasks, it makes sense to use a **paging strategy** to limit the number of tasks retrieved in a single fetch from the backend database.
Let's limit the number of fetched tasks to `20`.
In the `onMounted` hook, pass an `options` argument to the `find` method call and set its `limit` property to 20.
```ts{6}
// src/App.vue

onMounted(() =>
taskRepo
.find({
limit: 20
})
.then(items => (tasks.value = items))
)
```
There aren't enough tasks in the database for this change to have an immediate effect, but it will have one later on when we add more tasks.
::: tip
To query subsequent pages, use the [Repository.find()](../../docs/ref_repository.md#find) method's `page` option.
:::
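For instance, a sketch of fetching the second page of 20 tasks (not something we need in the tutorial app, reusing the same `taskRepo` repository defined in `App.vue`) could look like this:

```ts
// Illustration only: fetch tasks 21-40. The page option is 1-based.
const secondPage = await taskRepo.find({
  limit: 20,
  page: 2,
})
```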
## Sorting By Creation Date
We would like old tasks to appear first in the list, and new tasks to appear last. Let's sort the tasks by their `createdAt` field.
In the `onMounted` hook, set the `orderBy` property of the `find` method call's `options` argument to an object that contains the fields you want to sort by.
Use "asc" and "desc" to determine the sort order.
```ts{7}
// src/App.vue

onMounted(() =>
taskRepo
.find({
limit: 20,
orderBy: { createdAt: "asc" }
})
.then(items => (tasks.value = items))
)
```
## Server-side Filtering
Remult supports sending filter rules to the server to query only the tasks that we need.
Adjust the `onMounted` hook to fetch only `completed` tasks.
```ts{8}
// src/App.vue

onMounted(() =>
taskRepo
.find({
limit: 20,
orderBy: { createdAt: "asc" },
where: { completed: true }
})
.then(items => (tasks.value = items))
)
```
::: warning Note
Because the `completed` field is of type `boolean`, the argument is **compile-time checked to be of the `boolean` type**. Setting the `completed` filter to `undefined` causes it to be ignored by Remult.
:::
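To see why an `undefined` filter value is handy, here is a sketch of applying the filter conditionally (the `hideCompleted` flag is hypothetical and not part of the tutorial code):

```ts
// Illustration only: when hideCompleted is false, the where value resolves
// to undefined and Remult ignores the filter, returning all tasks.
const hideCompleted = true

const items = await taskRepo.find({
  limit: 20,
  orderBy: { createdAt: "asc" },
  where: { completed: hideCompleted ? false : undefined },
})
```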
Play with different filtering values, and eventually comment out the filter, since we do need all the tasks:
```ts{6}
onMounted(() =>
taskRepo
.find({
limit: 20,
orderBy: { createdAt: "asc" }
//where: { completed: true }
})
.then(items => (tasks.value = items))
)
```
::: tip Learn more
Explore the reference for a [comprehensive list of filtering options](../../docs/entityFilter.md).
:::
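As a taste of what's documented there, here is a sketch combining a few operators (the values are illustrative only):

```ts
// Illustration only: a few of Remult's filter operators combined.
const filtered = await taskRepo.find({
  where: {
    title: { $contains: "car" }, // title contains the text "car"
    createdAt: { $gte: new Date("2024-01-01") }, // created on or after this date
    completed: { $ne: true }, // not marked as completed
  },
})
```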
# vue - Tutorial - CRUD Operations
# CRUD Operations
## Adding new tasks
Now that we can see the list of tasks, it's time to add a few more.
Add the highlighted `newTaskTitle` ref and `addTask` function, and the relevant `