Go vs Rust vs Bun vs Node | Prime Reacts

Recorded live on twitch, GET IN

### The Author and Article

MY MAIN YT CHANNEL: Has well edited engineering videos

Discord

Hey, I am sponsored by Turso, an edge database. I think they are pretty neat. Give them a try for free, and if you want you can get a decent amount off (the free tier is the best, better than PlanetScale or any other).
Comments

Rust has electrolytes, it's what plants crave

Ataraxia_Atom

I conducted tests on two servers, each equipped with 1 Gbit bandwidth and running Ubuntu 22.04. My tests focused primarily on Rust and Go, though I also looked at Bun and Node in single-threaded mode. I raised the ulimit -n setting, and the responses were approximately 1 KB of JSON, simulating a typical production response.

For the production server (16 cores), I used a local test with the following settings: wrk -t10 -c4000 -d10s. The results were:

Rust (using warp) achieved 950K requests/s at 2200% CPU utilization.
Rust (using axum) achieved 850K requests/s at 2200% CPU utilization.
Go reached 400k requests/s with a CPU utilization of 2400%.
Bun, in single-threaded mode, achieved 120k requests/s at 100% CPU utilization.
When running 10 Bun instances on different ports, the combined score was 950k requests/s at 1000% CPU utilization (ran 10 separate wrk -t1 -c400 -d10s, one per port).
Node, in single-threaded mode, registered 34k requests/s at 100% CPU utilization, but with 10 threads, it hit 300k requests/s at 1000% CPU utilization.


For the second test, wrk was executed on the other server, the results were:

Go managed 128k requests/s with 900% CPU utilization.
Rust (warp) matched this with 128k requests/s but at a lower 420% CPU usage.
Rust (axum) also achieved 128k requests/s, consuming 500% of the CPU.
Bun maintained its performance at 120k requests/s in single-threaded mode with 100% CPU utilization.
Node in single-threaded mode had a slightly decreased performance of 30k requests/s at 100% CPU utilization.
In conclusion, bandwidth limitations were evident in the second test. Impressively, Bun showcased exceptional request-handling capabilities.

Also, the preceding text was structured with the help of ChatGPT (and this one).
I'm working on a second test using some database inserts/reads.

catalinstochita

It's hilarious because Python devs would be more than happy to get the numbers node got

marhoonothoja

I think the next step is to JSON stringify a result that was awaited from a DB. Same DB for all tests, but it’d test more of the concurrency system.

EvanBoldt

we need original primeagen to do rust vs go vs bun vs node!!!

muhammadsalmanafzal

Latency is important as well. GC does not run on every request, so some requests would have very long response times because they waited on GC.

nekony

Until Bun appeared, I straight up refused to build any backend in Javascript after learning about Go and Rust. Tbh, I'm still not a huge fan of JS because of the problems with the language itself, but I feel more comfortable now going to sit at the soy dev table with the React Andys where they keep all the money.

andythedishwasher

Go is the language. Fast enough, simple enough. Everything else is a madhouse these days.

nonefvnfvnjnjnjevjenjvonej

Yo. The article didn't build Bun properly (I think). By default bun does --target=browser and not --target=bun, which would be the intended thing for a server-side use case. But the results still make sense in general.

Darkitz

I do not understand why Gary says "I feel bad for Node". Node and Bun were doing pretty well, and he is comparing them to system-level languages. Let's see what you get from languages in the same category like PHP, Ruby, Python, or even Java and C#...

jaroslavhuss

Go uses multiple threads to schedule goroutines at runtime, while Bun runs in a single thread. To make these tests comparable, we need to run one Bun service per CPU core behind an HTTP load balancer; the same goes for Node.

shadowfaxenator

I'll let my boss know that this is why my code is slow... obviously if I just rewrite it in a good language it'll be better.

Kane

That really looks like an upload / download speed limitation. They were all coming in right around 3MB/s transfer when running in Linode. It's pretty unbelievable the bottleneck is the server or Linode's network. If that is a limitation from Linode, then it'd be better to test against AWS or GCP.

ra_benton

I may write an article like this comparing package install times between 'bun add' and 'npm install'. Those are some sexy numbers.

andythedishwasher

One of the biggest issues with this test is that it doesn’t consider any actual server-side processing. It’s kinda assuming your service is just a data gateway and nothing more.

Sure, Bun is fast because its internal libraries are written in Zig, but what happens when you have your own application code? You fall back to the slower JavaScript engine (JavaScriptCore in Bun's case, not V8).

DynamicalisBlue

Multi-billion-dollar companies running Node wondering what this guy is talking about.

ksomeone

That was the coldest hot take I've ever seen

carlosmspk

Using Zig for a backend instead. Why? Resources. 4 MB for the container (including all assets), and a 20 MB memory footprint under load… that's app + threaded HTTP server.

Requests all served under 1ms

Throughput not a requirement for my use case (I don’t need 128k req per sec), but I do care about server costs, because I pay for them

steveoc

I would love it if Prime understood Portuguese so I could show him "the fight" we have in our Brazilian community: basically, build an HTTP server in any language that can support concurrent connections to the database. You would love it.

lucasoliveira-xsyh

Tbh, Axum is more of a feature-rich, convenient-for-development framework than the fastest one. The tester should use something faster, or use Gin in Go and NestJS in Node and Bun, to make things fair.

kirillgimranov