Welcome to GigaElixir Gazette, your 5-minute digest of Elixir ecosystem news that actually matters.

This week: Oban Pro rethinks workflows and rate limiting, SIMD-powered JSON parsing exposes massive memory inefficiencies, and we look at how Elixir is pushing deeper into production systems with runtime tracing and UI-driven AI tooling.

. WEEKLY PICKS .

Torque Decodes JSON 2.3x Faster Than Jason Using SIMD and Rust NIFs

Torque wraps sonic-rs via Rustler NIFs and targets the JSON hot path. On a 1.2 KB OpenRTB payload (Apple M2 Pro, OTP 28): 255.8K decode ops/s at 3.91 µs mean versus Jason's 109.1K at 9.17 µs. Encoding is where it gets absurd - 64 bytes of memory per encode versus Jason's 3,912 bytes. That is a 61x memory difference on every single encode call.

Selective field extraction via JSON Pointer means you can pull specific fields from large payloads without parsing the entire document. Automatic dirty CPU scheduler dispatch for large inputs keeps your scheduler threads clean. Precompiled binaries ship for common targets, but if you want AVX2 or AVX512, build from source with TORQUE_BUILD=true.
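JSON Pointer extraction is easy to picture in plain Elixir. The sketch below is stdlib-only (it uses OTP 27's `:json` module; `PointerSketch` is an invented name, not Torque's API) and walks a decoded document with an RFC 6901-style pointer - the difference is that Torque does this inside the parser, without materializing the whole document first.

```elixir
# RFC 6901-style pointer lookup over a fully decoded document.
# Torque's selective extraction skips the "decode everything" step;
# this stdlib version shows only the pointer-walking half.
defmodule PointerSketch do
  def get(json, pointer) when is_binary(json) do
    doc = :json.decode(json)

    pointer
    |> String.trim_leading("/")
    |> String.split("/")
    |> Enum.reduce(doc, fn
      key, map when is_map(map) -> Map.fetch!(map, key)
      index, list when is_list(list) -> Enum.at(list, String.to_integer(index))
    end)
  end
end

PointerSketch.get(~s({"imp":[{"bidfloor":0.5}]}), "/imp/0/bidfloor")
# => 0.5
```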

🔥 Oban Pro 1.7 Ships Sub-Workflows, Three Rate Limiting Algorithms

The biggest Oban Pro release in a while. Workflows get a dedicated tracking table with database triggers replacing expensive aggregation queries. Sub-workflows and context sharing let you compose job pipelines without the M*N scaling problem that plagued shared dependencies - benchmarks show 2x faster execution, 7x fewer buffer hits, and 15x fewer index scans.

Rate limiting now supports sliding window, fixed window, and token bucket algorithms with variable job weights. The new RateLimit module lets you check capacity and consume quota outside of job execution. Chunks get pre-computed IDs for faster lookups and snooze support for selective retry. Generated columns are gone - replaced with expression indexes that eliminate table-locking migrations.
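Of the three algorithms, token bucket is the one that pairs naturally with variable job weights. Here is a minimal, dependency-free sketch of the idea (this is the textbook algorithm, not Oban Pro's implementation): the bucket refills continuously at `rate` tokens per second up to `capacity`, and a job of weight `w` only runs if `w` tokens are available.

```elixir
# Textbook token bucket with variable-weight consumption.
defmodule TokenBucket do
  defstruct [:capacity, :rate, :tokens, :updated_at]

  def new(capacity, rate, now) do
    %__MODULE__{capacity: capacity, rate: rate, tokens: capacity, updated_at: now}
  end

  # Try to consume `weight` tokens at time `now` (in seconds).
  # First refill based on elapsed time, then check capacity.
  def consume(%__MODULE__{} = bucket, weight, now) do
    tokens = min(bucket.capacity, bucket.tokens + (now - bucket.updated_at) * bucket.rate)

    if tokens >= weight do
      {:ok, %{bucket | tokens: tokens - weight, updated_at: now}}
    else
      {:error, %{bucket | tokens: tokens, updated_at: now}}
    end
  end
end
```

With capacity 10 and rate 1.0, two weight-4 jobs at t=0 succeed, a third is rejected, and by t=10 the bucket has refilled enough to accept it again.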

🎯 Phoenix Scopes Move Authorization Logic Where It Belongs

Scattering permission checks across plugs and controllers is the default Phoenix pattern, and it is wrong. Curiosum's article on Phoenix Scopes shows how to structure authorization closer to your domain - scoped contexts that carry permission logic with them instead of sprinkling guards everywhere.

The approach integrates with Permit.Phoenix for declarative authorization rules. Instead of checking permissions at every controller action, scopes define what a user can access at the context level. The permission logic lives next to the data it protects, not three layers above it in a plug pipeline.
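Stripped of Ecto and Phoenix, the core idea fits in a few lines. In this toy sketch (names are illustrative, not Curiosum's or Permit.Phoenix's code), the scope travels with every context call, so the caller cannot forget the check:

```elixir
# The scope (here, just the current user) is a required argument of the
# context function, and filtering happens inside the context --
# callers get back only what the scope is allowed to see.
defmodule Posts do
  def visible(posts, %{user: %{id: user_id, role: role}}) do
    Enum.filter(posts, fn post ->
      role == :admin or post.author_id == user_id
    end)
  end
end

posts = [%{id: 1, author_id: 7}, %{id: 2, author_id: 9}]
Posts.visible(posts, %{user: %{id: 7, role: :member}})
# => [%{id: 1, author_id: 7}]
```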

🚀 MCP Apps Ship Interactive UIs Inside Claude and VS Code from Elixir

ConduitMCP v0.9.0 lets your MCP tools return live HTML widgets rendered directly inside AI host conversations. The pattern: your tool declares a ui:// resource, the host fetches it, renders it in a sandboxed iframe, and the iframe communicates back via JSON-RPC over postMessage. Dashboards, forms, charts - all inside the chat window.

The Elixir DSL makes this dead simple. A ui/1 macro links tools to HTML resources, app/2 registers both in one declaration, and app_html/1 handles MIME types. Your tool returns BEAM system metrics, the UI renders a live dashboard that can call back into server tools for fresh data. Zero client-side build step, zero React, just HTML and the BEAM.

💡 Ruby Developers Face the Speed Wall - Elixir and Crystal Offer Different Exits

Every Ruby developer eventually hits the performance ceiling. Thousands of WebSocket connections eat RAM. Batch jobs crawl through millions of rows. Two languages share Ruby's syntax DNA but solve different problems. Elixir runs on the BEAM and handles millions of concurrent processes with fault tolerance baked in. Crystal compiles to native code and runs 10x faster than Ruby for CPU-bound work.

The choice is architectural: Elixir for real-time concurrency (chat, live dashboards, IoT), Crystal for raw computation speed (data processing, CLI tools). Phoenix handles what Rails cannot - millions of persistent connections on a single server. Crystal handles what Ruby cannot - compiled performance without learning Rust or Go syntax.

💡 Pro Tip

You Don't Have Tracing Installed? Load It at Runtime

Lars Wikman at Underjord solved the most annoying tracing problem in the BEAM ecosystem: you never have your tracing library installed when you actually need it.

Entrace now ships a Mini script you can paste into a remote IEx console for instant tracing. Need the full-featured version? A one-liner downloads pre-compiled .beam files for your OTP version (27 and 28 supported), loads them into the running system, and you are tracing functions in seconds. Zero restarts, zero deploys.
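The mechanism behind the one-liner is ordinary OTP hot code loading. A self-contained illustration (nothing Entrace-specific; the `HotLoaded` module is invented): compile source to bytecode in memory and hand it to the code server - no release, no restart.

```elixir
# Compile a module from a source string, then load the resulting
# bytecode into the running VM via the code server. Entrace's download
# script does the same thing with pre-compiled .beam files from disk.
source = """
defmodule HotLoaded do
  def hello, do: :world
end
"""

[{module, bytecode}] = Code.compile_string(source)
{:module, ^module} = :code.load_binary(module, ~c"hot_loaded.ex", bytecode)

apply(module, :hello, [])
# => :world
```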

The second project, entrace_opentelemetry, bridges Erlang tracing with OpenTelemetry spans. At runtime, ask your app to trace a specific query and get timing, input values, output values, and error information as OTel spans in Sentry or Honeycomb. Want to trace a function only when a particular argument is nil? Throw the match spec into your remote IEx console and watch production spit out new spans.

The overhead is mild because Erlang tracing is a VM primitive, not an instrumentation layer bolted on top.

Remember for hot-loaded tracing:

  • Entrace.Mini gives you simplified tracing as a single pasteable module for any remote IEx console - zero dependencies required

  • Pre-compiled .beam downloads let you load full Entrace into a running OTP 27/28 system without restarting or redeploying

  • entrace_opentelemetry converts Erlang trace events into OpenTelemetry spans - runtime-injectable observability for any function

  • Match specs let you conditionally trace specific argument patterns in production without recompiling or restarting the application
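The match-spec trick maps directly onto plain OTP primitives if you want to see what is happening underneath (this uses standard `:dbg`, not Entrace's API): trace `Map.get/2` only when the key argument is nil.

```elixir
# Start the default tracer, enable call tracing on all processes, and
# set a trace pattern whose match spec head only matches calls where
# the second argument (the key) is nil.
:dbg.tracer()
:dbg.p(:all, :call)
{:ok, _} = :dbg.tp(Map, :get, 2, [{[:_, nil], [], []}])

Map.get(%{}, :hits)  # not traced: key is :hits
Map.get(%{}, nil)    # traced: key is nil

:dbg.stop()
```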

. TIRED OF DEVOPS HEADACHES? .

Deploy your next Elixir app hassle-free with Gigalixir and focus more on coding, less on ops.

We're specifically designed to support all the features that make Elixir special, so you can keep building amazing things without becoming a DevOps expert.

See you next week,

Michael

P.S. Forward this to a friend who loves Elixir as much as you do 💜

Keep Reading