If you've been paying attention to Web Dev Twitter, it'd be hard to miss the increasing number of tweets decrying the choice of GraphQL for developers' applications, and proclaiming how much better tRPC or their favorite framework is. Let's dig into that.
“Most apps don’t benefit from [GraphQL].” https://t.co/WVyvrRPSpp
— ✨ Josh Branchaud 🦃 (@jbrancha) November 28, 2022
While some of this is definitely hyperbolic, it can be distilled down to a much simpler problem: how do developers most easily connect their frontend code, which is responsible for displaying and collecting data, with the backend, which stores and manipulates that data? GraphQL was not designed to do that!
GraphQL was designed to be a typesafe, declarative data collection schema language. Facebook made it to flexibly collect only the data they needed from their myriad network of microservices. It's supposed to insert itself into your data pipeline and provide type guarantees from a loosely connected group of interrelated services. It has an entire type system that needs to be converted into and out of the destination language of choice. Using it solely for its type safety guarantees is missing the forest for the trees. Roy Derks does a great job of laying out the differences, so check that out if you want more detail.
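To make the "declarative data collection" point concrete, here's a minimal sketch of what a GraphQL request looks like from a Typescript client. The schema, fields, and endpoint are hypothetical; the point is that the client names exactly the fields it needs, and the GraphQL layer assembles them, possibly from several services.

```typescript
// Hypothetical schema: the client declares exactly the fields it needs.
const query = `
  query UserCard($id: ID!) {
    user(id: $id) {
      name
      avatarUrl
    }
  }
`;

// A GraphQL call is just a POST with { query, variables } as the body.
function buildGraphQLRequest(
  query: string,
  variables: Record<string, unknown>
): string {
  return JSON.stringify({ query, variables });
}

const body = buildGraphQLRequest(query, { id: "42" });
// In a real app you'd send this: fetch("/graphql", { method: "POST", body, ... })
```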
Solutions
What should tRPC be compared with? Well, I think it's fair game to compare it against other solutions to the server-client problem across the web ecosystem. Let's look at a few of them:
API Frameworks
tRPC
tRPC bills itself as the solution to providing end-to-end typesafe APIs for Typescript in React, and it's hard to argue with that. A change made to a server function signature will produce a Typescript error in the client, and vice versa. tRPC is heavily tied to Typescript, so if you use another language, you are out of luck. It definitely has its proponents, notably Theo of Ping.gg.
I know I shill it a lot but seriously, please try @trpcio

The amount that tRPC has improved the quality of our code, the speed of our delivery, and the happiness of our devs is hard to comprehend.

— Theo - ping.gg (@t3dotgg) September 19, 2022
tRPC is designed to provide exportable types from the server, and an interface to call procedures (functions) from the client. Because it doesn't completely abstract away the serialization and transport layer, it also recommends Zod to validate client inputs, requiring a bit more work to implement. Let's see how one would implement this, straight from the tRPC docs for Next.js.
Skipping install steps, we need to create a router to process the incoming API calls.
```typescript
// server/trpc.ts
// This creates the router instance along with its helpers
import { TRPCError, initTRPC } from '@trpc/server';

// Avoid exporting the entire t-object
// since it's not very descriptive.
// For instance, the use of a t variable
// is common in i18n libraries.
const t = initTRPC.create();

// Base router and procedure helpers
export const router = t.router;
export const procedure = t.procedure;
```
Then the router needs to be imported and constructed in your server/routers/_app.ts file
```typescript
// server/routers/_app.ts
import { z } from 'zod';
import { procedure, router } from '../trpc';

export const appRouter = router({
  hello: procedure
    .input(
      z.object({
        text: z.string(),
      }),
    )
    // You'd want to break the functions out into separate files in a prod app
    .query(({ input }) => {
      return {
        greeting: `hello ${input.text}`,
      };
    }),
});

// export type definition of API
export type AppRouter = typeof appRouter;
```
Then the trpc routes need to be registered as a wildcard route.
```typescript
// pages/api/trpc/[trpc].ts
import * as trpcNext from '@trpc/server/adapters/next';
import { appRouter } from '../../../server/routers/_app';

// export API handler
export default trpcNext.createNextApiHandler({
  router: appRouter,
  createContext: () => ({}),
});
```
Next, wrap the app root with the global tRPC object.
```tsx
// pages/_app.tsx
import type { AppType } from 'next/app';
import { trpc } from '../utils/trpc';

const MyApp: AppType = ({ Component, pageProps }) => {
  return <Component {...pageProps} />;
};

export default trpc.withTRPC(MyApp);
```
Finally, it can be called from your React component.
```tsx
// pages/index.tsx
import { trpc } from '../utils/trpc';

export default function IndexPage() {
  const hello = trpc.hello.useQuery({ text: 'client' });
  if (!hello.data) {
    return <div>Loading...</div>;
  }
  return (
    <div>
      <p>{hello.data.greeting}</p>
    </div>
  );
}
```
Admittedly, this is the stack I am least familiar with, but it also seems to be the clunkiest setup of the options we'll discuss. Most of the RPC approaches below externalize the work to a compiler through macros or custom bundler plugins. It's hard for me to get excited about needing to define both a function and its input validation in the router, and then calling it through a property of a property on an object. I know your VSCode Intellisense will help you get everything right; it's just a lot of boilerplate.
Not to say that tRPC isn't the best approach for React/Next.js apps, I haven't done nearly enough with it to be sure. It does have a very nice integration with Tanstack Query, and the type safety is quite real. I'm just not in love with the API. If you'd like to check out tRPC, one good way is to use the t3 Stack, Theo's starter stack for Next.js and tRPC.
gRPC
If multi-language support is a requirement, gRPC/gRPC-Web can definitely pull its weight. It provides a schema to enforce typesafety across a variety of languages, has more types than GraphQL, encodes to binary, can handle streaming*, and is much faster than REST. It's missing a decent Typescript server, but if your backend runs in Go, Rust, or another language with one, the lovely folks at Buf have created a Typescript library called connect-web that will automatically generate Typescript bindings from the schema.
gRPC is not a closely coupled solution between a client and a server; in that way, it's better compared to GraphQL than to tRPC. It requires a lot of the same intermediate steps GraphQL does: types must be generated from a protobuf file and then imported in your web client. I'd reach for this if the data transferred between frontend and backend is primarily binary (like images and video), or if you need streaming. Due to some of the limitations of gRPC-Web, gRPC is much stronger communicating between servers, or between servers and mobile/desktop apps, than between servers and the browser. If you're interested in learning more about gRPC, I wrote about it a short time ago here.
```typescript
const answer = await eliza.say({ sentence: "I feel happy." });
console.log(answer);
// {sentence: 'When you feel happy, what do you do?'}
```
connect-web has a pretty nice UI though
Framework Solutions
Remix
I owe Remix a debt, in that they introduced me to the idea of the client-server gap in the first place. As Ryan Florence proclaimed on Twitter in Jan 2022, Remix is a "center stack" framework. Remix's core argument is, in essence:
I think I finally know how to talk about Remix.

Forget “full stack”. Remix is center stack. It’s an HTTP tunnel between interactivity and business logic. It’s the part of the stack that no tool before it has solved, not completely.

API framework? The browser has everything we need right here.

— Ryan Florence (@ryanflorence) January 15, 2022
Utilizing a custom Typescript/Javascript bundler built on esbuild, the team made it possible to colocate the server code and the client code in the same file. Querying data is as simple as sending a GET request to a particular endpoint, which runs a `loader` function and returns a Response. Data mutations are POST requests to an endpoint, which run the `action` function. Action inputs are typically form submissions, encoded as `application/x-www-form-urlencoded`, and Responses are JSON objects that can be handled by the loader and passed to your React components with handy utility functions. End-to-end typesafety can be inferred from the Response types included in the `loader` and the `action`.
```tsx
import type { LoaderFunction, ActionFunction } from "@remix-run/node";
import { json } from "@remix-run/node";
import { Form, useActionData, useLoaderData } from "@remix-run/react";

// Run on a POST request to the route the file represents
export const action: ActionFunction = async ({ request }) => {
  const formData = await request.formData();
  let firstName = formData.get("firstName");
  if (!firstName) {
    return json({ error: { firstName: "Name not found!" } });
  }
  // store the data
  await setName(firstName);
  // returns json
  return json({
    firstName,
  });
};

// Run on a GET request to the route the file represents
export const loader: LoaderFunction = async ({ request }) => {
  let firstName = await fetch("https://example.com");
  // returns json
  return json({
    firstName,
  });
};

export default function Index() {
  const { firstName } = useLoaderData();
  ...
}
```
This heavy use of the Request/Response model, colocation of server and client functions, and heavy use of the browser have set Remix on a remarkable growth curve. It's not all roses though. Because of React's structure, there's no easy way for Remix/React to selectively update one thing in a component/page/layout. Submitting a form will hit the `action` function and refresh all the data covered by that `loader`. Some of that can be mitigated through the use of nested layouts, but it's impossible to completely eliminate.
Solid and Solid Start
SolidJS is the OG reactive framework for Javascript/Typescript. Ryan Carniato and the gang have created an entirely different method of rendering web pages, without the use of a VDOM. The reactive model has a lot of benefits, most notably increased performance and fine-grained control of data fetching and rerendering. To learn more, check out this video by Dan Jutan introducing it, and Ryan's blog post on the basics.
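To make "reactive" concrete, here's a toy signal-and-effect sketch in Typescript. This is nothing like Solid's actual implementation (it skips cleanup, nesting, and batching entirely); it just shows the core idea that a signal remembers which effects read it, and re-runs only those.

```typescript
// Toy fine-grained reactivity, for illustration only.
type Effect = () => void;
let currentEffect: Effect | null = null;

function createSignal<T>(value: T): [() => T, (v: T) => void] {
  const subscribers = new Set<Effect>();
  const read = () => {
    // Track whichever effect is currently running as a dependent.
    if (currentEffect) subscribers.add(currentEffect);
    return value;
  };
  const write = (v: T) => {
    value = v;
    // Re-run only the effects that actually read this signal.
    subscribers.forEach((fn) => fn());
  };
  return [read, write];
}

function createEffect(fn: Effect): void {
  currentEffect = fn;
  fn(); // the first run registers the effect's dependencies
  currentEffect = null;
}

// Only the effect that reads `count` re-runs on updates.
const [count, setCount] = createSignal(0);
let rendered = "";
createEffect(() => { rendered = `count is ${count()}`; });
setCount(5); // rendered is now "count is 5"
```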
Because we have fine-grained control of rendering, Solid can selectively update DOM elements whenever data changes. Each Signal, or piece of data, keeps track of which Effects depend on it. When it updates, it reruns only those Effects, updating only the DOM elements that need to change. That granularity also lets us define a more granular API. One way Solid does this is with `server$`. The docs give a good description of how it works; I'll borrow some of their code below:
```typescript
import server$ from 'solid-start/server';

function Component() {
  // This will only console.log() on the server
  const logHello = server$(async (message: string) => {
    console.log(message);
  });

  logHello('Hello');
}
```
Here we define a `logHello` function that takes in a `message` string and `console.log()`s it. The interesting part is the `server$()` method, which is compiled and replaced by the Solid compiler. On the client, it subs out the call for a data-fetching method. On the server, it adds a data-decoding method along with the regular function code.
```typescript
// Server code
import server$ from 'solid-start/server';

// COMPILATION OUTPUT on the server
server$.registerHandler(
  '/Log.tsx/logHello',
  async (message: string) => {
    console.log(message);
  }
);
const serverFunction1 = server$.createHandler('/Log.tsx/logHello', '#');

function Component() {
  const logHello = serverFunction1;
  logHello('Hello');
}

// Client code
import server$ from 'solid-start/server';

// COMPILATION OUTPUT on the client
const serverFunction1 = server$.createFetcher('/Log.tsx/logHello');

function Component() {
  const logHello = serverFunction1;
  logHello('Hello');
}
```
This is all enabled by the Solid compiler, which generates both the server and client versions. The client call to `logHello()` is replaced with a `createFetcher('/Log.tsx/logHello')`, whose job is to encode the arguments and send them as a POST request to the handler the compiler created for this function. The server version registers the route handler, passes the input args to the function, and serializes the output (if there is any) to be sent back to the client.
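The mechanism can be sketched as a toy model (hypothetical names, no real HTTP): the "server" output registers handlers in a route table, and the "client" output gets a fetcher keyed by the same route. Here the fetcher dispatches in-process and synchronously; the real compiled client would serialize the argument and POST it to the route asynchronously.

```typescript
// Toy model of the server$ transform, for illustration only.
type Handler = (arg: string) => string;
const routes = new Map<string, Handler>();

function registerHandler(route: string, fn: Handler): void {
  routes.set(route, fn);
}

function createFetcher(route: string): Handler {
  // The real version would encode `arg` and POST it to `route`;
  // here we just look the handler up in the shared table.
  return (arg) => {
    const handler = routes.get(route);
    if (!handler) throw new Error(`No handler registered for ${route}`);
    return handler(arg);
  };
}

// "Server" compilation output:
registerHandler("/Log.tsx/logHello", (message) => `server saw: ${message}`);

// "Client" compilation output:
const logHello = createFetcher("/Log.tsx/logHello");
console.log(logHello("Hello")); // "server saw: Hello"
```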
I really love this model. We're able to define the function call in the client, we've got end-to-end typesafety because it's all generated from one Typescript function, and it's colocated with the relevant client logic. If we were so inclined, we could tie this function to a Signal and get a surgical update.
Leptos
I've been watching the Rust frontend framework community with bated breath, looking for something that competes with or beats React. A couple of months ago I found Leptos, the most promising Rust framework I've seen yet. It is heavily based on Solid, and it beats React in every benchmark except bundle size.
Greg Johnston, Leptos' founder, also implemented his own version of Solid's server functions. When asked about his inspiration for the feature, he said it was heavily influenced by Solid and Remix; we'll see how in a bit. To use a server function, we define a function whose args and return type implement `serde::Serialize` and `serde::Deserialize`. Functions are processed in Leptos using the `server` proc macro, which does a similar thing to Solid's compiler above. The macro lets you specify a function name, the route to place the function at, and the encoding used for serializing and deserializing the results.
```rust
#[server(AddTodo, "/api", "Url")]
pub async fn add_todo(title: String) -> Result<(), ServerFnError> {
    let mut conn = db().await?;

    // fake API delay
    std::thread::sleep(std::time::Duration::from_millis(1250));

    match sqlx::query("INSERT INTO todos (title, completed) VALUES ($1, false)")
        .bind(title)
        .execute(&mut conn)
        .await
    {
        Ok(_row) => Ok(()),
        Err(e) => Err(ServerFnError::ServerError(e.to_string())),
    }
}
```
Inspired by Remix, the default encoding for args sent to the server is `application/x-www-form-urlencoded`, and server results are returned as JSON. This makes it easy to process form submissions and return results. If the data involves large binary objects, or the overhead of serializing to and from JSON is too much, you can choose the `"Cbor"` encoding instead, which should net you some nice performance gains.
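The default wire format is the same one browsers use for plain form posts, so it's easy to inspect from any language. In Typescript/browser terms (not Leptos code), a server function's `title` argument travels like this:

```typescript
// application/x-www-form-urlencoded is just percent-encoded key=value
// pairs, with spaces as "+". URLSearchParams produces exactly this format.
const params = new URLSearchParams({ title: "Buy milk & eggs" });
console.log(params.toString()); // title=Buy+milk+%26+eggs
```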
Next we have to register our server functions. Unlike JS, Rust won't let us randomly mutate global objects, so we need an explicit registration step to set up the route handlers for the server functions.
```rust
// Define a function to register them (useful if you have more than one)
pub fn register_server_functions() {
    AddTodo::register();
}

// Call the function inside our main function
crate::todo::register_server_functions();
```
If you haven't already, you'll need to make sure the router knows how to handle the POST requests that are sent to the server. To do that, we add a wildcard route to our backend server.
```rust
// Other routes omitted for brevity. This is for Axum; Actix would be a bit different.
let app = Router::new()
    .route("/api/*fn_name", post(leptos_axum::handle_server_fns));
```
Then we can call it in our client code as if it was defined locally.
```rust
// Creates an Action to synchronize an imperative async function call
// to the synchronous reactive system.
let add_todo = create_server_multi_action::<AddTodo>(cx);

view! { cx,
    <div>
        // MultiActionForm automates the form handler and the event handlers
        <MultiActionForm action=add_todo>
            <label>
                "Add a Todo"
                <input type="text" name="title"/>
            </label>
            <input type="submit" value="Add"/>
        </MultiActionForm>
    </div>
}
```
Much like Solid's compiler, the `server` proc macro makes this possible. Like Solid and tRPC, we get end-to-end typesafety because it's all generated from one function declaration. There's a bit more boilerplate in the Rust version due to the more stringent borrow checker and ownership semantics, but I don't have to worry about input validation or serialization/deserialization. If you'd like to look at the full example these code samples were derived from, it can be found here in the leptos repo.
Conclusion
Which solution you choose is going to depend heavily on what your needs are and what stack you'd like to use. If it's a greenfield project on a smaller team, Remix or Solid might enable you to build with less boilerplate. If you're a Rustacean like me, Leptos has you covered. And if you're tied into the React/Next.js ecosystem, tRPC might be the library for you.
If you're choosing a solution for a distributed API, one that needs to get data from multiple sources, or if your frontend and backend teams are entirely separate, GraphQL or gRPC is a better choice than any RPC framework. They provide an enforceable, versioned schema that is understood by a variety of services and servers in many different languages.
In a lot of ways, I think the popularity of tRPC comes from solving a problem inherent to Next.js and React. If Next.js were to implement RPC functionality similar to Solid, Leptos, or Remix, I don't see tRPC surviving long term. I found this tweet after I wrote this piece, but it sums things up very nicely.

Quit it with the tRPC v GraphQL posts. All that says to me is that you didn't understand the needs of your web app when you built it, or that you built it before tRPC was a thing. It's not a fair comparison.

Reason why I say that is cause I see frameworks starting to build in native typesafe RPC functionality. But I don't think they will build in GraphQL functionality and straight up RPC calls don't entirely replace what you get from GraphQL.

— ☃️ Anthony (ajcwebdev.x) ❄️ (@ajcwebdev) November 28, 2022