
Allocations in Rust: Compiler optimizations

· 4 min read

Up to this point, we've been discussing memory usage in the Rust language by focusing on simple rules that are mostly right for small chunks of code. We've spent time showing how those rules work themselves out in practice, and become familiar with reading the assembly code needed to see each memory type (global, stack, heap) in action.

Throughout the series so far, we've put a handicap on the code. In the name of consistent and understandable results, we've asked the compiler to pretty please leave the training wheels on. Now is the time when we throw out all the rules and take off the kid gloves. As it turns out, both the Rust compiler and the LLVM optimizers are incredibly sophisticated, and we'll step back and let them do their job.
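To get a feel for the kind of transformation involved (this snippet is an illustration written for this summary, not an example taken from the post), consider a function whose result is fully known at compile time. Built with optimizations enabled (cargo build --release, or rustc -C opt-level=3), LLVM will typically fold the whole computation away:

pub fn sum_fixed() -> u32 {
    // Every input here is known at compile time; an optimized build
    // typically reduces this entire function to "return 10".
    let values = [1u32, 2, 3, 4];
    values.iter().sum()
}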

Allocations in Rust: Dynamic memory

· 6 min read
Bradlee Speice

Managing dynamic memory is hard. Some languages assume users will do it themselves (C, C++), and some languages go to extreme lengths to protect users from themselves (Java, Python). In Rust, dynamic memory (also referred to as the heap) is governed by a system called ownership. And as the docs mention, ownership is Rust's most unique feature.

The heap is used in two situations: when the compiler is unable to predict either the total size of memory needed or how long the memory is needed for, it allocates space on the heap.
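As a quick illustration of those two situations (a sketch written for this summary, not code from the post itself): a growable Vec covers the "size unknown at compile time" case, and a Box returned from a function covers the "lives longer than the current stack frame" case.

fn make_on_heap() -> Box<u32> {
    // The boxed value must outlive this function's stack frame,
    // so it is allocated on the heap.
    Box::new(42)
}

fn main() {
    // The compiler can't know how many elements will be pushed,
    // so the Vec's storage lives on the heap and grows at runtime.
    let mut numbers = Vec::new();
    for i in 0..10 {
        numbers.push(i);
    }

    let boxed = make_on_heap();
    println!("{} elements, boxed value {}", numbers.len(), boxed);
}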

Allocations in Rust: Fixed memory

· 16 min read
Bradlee Speice

const and static are perfectly fine, but it's relatively rare that we know, at compile time, the values or references that will stay the same for the duration of our program. Put another way, it's not often the case that either you or your compiler knows how much memory your entire program will ever need.

However, there are still some optimizations the compiler can do if it knows how much memory individual functions will need. Specifically, the compiler can make use of "stack" memory (as opposed to "heap" memory) which can be managed far faster in both the short- and long-term.
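As a rough illustration (not taken from the post): when a function's locals have sizes known at compile time, they live in that function's stack frame and no allocator is involved.

fn stack_only() -> i64 {
    // The size of every local here is known at compile time, so the array
    // and the running total both live in this function's stack frame;
    // nothing touches the heap.
    let values: [i64; 4] = [1, 2, 3, 4];
    let mut total = 0;
    for v in values.iter() {
        total += *v;
    }
    total
}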

Allocations in Rust: Global memory

· 8 min read
Bradlee Speice

The first memory type we'll look at is pretty special: when Rust can prove that a value is fixed for the life of a program (const), and when a reference is unique for the life of a program (static as a declaration, not 'static as a lifetime), we can make use of global memory. This special section of data is embedded directly in the program binary so that variables are ready to go once the program loads; no additional computation is necessary.

Understanding the value/reference distinction is important for reasons we'll go into below, and while the full specification for these two keywords is available, we'll take a hands-on approach to the topic.
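As a minimal sketch of the two declarations (an illustration for this summary, not code from the post): const names a value the compiler may copy wherever it's used, while static names a single location that exists for the life of the program.

// A value the compiler may inline/copy wherever it's referenced.
const MAX_RETRIES: u32 = 3;

// A single, fixed location in memory for the life of the program.
static GREETING: &str = "hello";

fn main() {
    println!("{} (retries allowed: {})", GREETING, MAX_RETRIES);
}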

Allocations in Rust: Foreword

· 4 min read
Bradlee Speice

There's an alchemy of distilling complex technical topics into articles and videos that change the way programmers see the tools they interact with on a regular basis. I knew what a linker was, but there's a staggering amount of complexity in between the OS and main(). Rust programmers use the Box type all the time, but there's a rich history of the Rust language itself wrapped up in how special it is.

In a similar vein, this series attempts to look at code and understand how memory is used; the complex choreography of operating system, compiler, and program that frees you to focus on functionality far-flung from frivolous book-keeping. The Rust compiler relieves a great deal of the cognitive burden associated with memory management, but we're going to step into its world for a while.

Let's learn a bit about memory in Rust.

QADAPT - debug_assert! for allocations

· 5 min read
Bradlee Speice

I think it's part of the human condition to ignore perfectly good advice when it comes our way. A bit over a month ago, I was dispensing sage wisdom for the ages:

I had a really great idea: build a custom allocator that allows you to track your own allocations. I gave it a shot, but learned very quickly: never write your own allocator.

-- me

I proceeded to ignore it, because we never really learn from our mistakes.
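QADAPT's own internals aren't reproduced here, but as a rough sketch of the mechanism such a tool builds on (the names below are made up for illustration), Rust lets you register a custom global allocator that wraps the system allocator and records each allocation:

use std::alloc::{GlobalAlloc, Layout, System};
use std::sync::atomic::{AtomicUsize, Ordering};

// Count of allocations made through the global allocator (illustrative only).
static ALLOCATION_COUNT: AtomicUsize = AtomicUsize::new(0);

struct CountingAllocator;

unsafe impl GlobalAlloc for CountingAllocator {
    unsafe fn alloc(&self, layout: Layout) -> *mut u8 {
        ALLOCATION_COUNT.fetch_add(1, Ordering::SeqCst);
        System.alloc(layout)
    }

    unsafe fn dealloc(&self, ptr: *mut u8, layout: Layout) {
        System.dealloc(ptr, layout)
    }
}

#[global_allocator]
static GLOBAL: CountingAllocator = CountingAllocator;

fn main() {
    let _v = vec![1, 2, 3];
    println!("allocations so far: {}", ALLOCATION_COUNT.load(Ordering::SeqCst));
}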

More "what companies really mean"

· 2 min read
Bradlee Speice

I recently stumbled across a phenomenal small article entitled What Startups Really Mean By "Why Should We Hire You?". Having been interviewed by smaller companies (though not exactly startups), I've found that the questions and subtexts are the same. There's often a question behind the question that you're actually trying to answer, and I wish I had spotted the nuance earlier in my career.

Let me also make note of one more question/euphemism I've come across:

A case study in heaptrack

· 5 min read
Bradlee Speice

I remember early in my career someone joking that:

Programmers have it too easy these days. They should learn to develop in low memory environments and be more efficient.

...though it's not like the first code I wrote was for a graphing calculator packing a whole 24KB of RAM.

But the principle remains: be efficient with the resources you have, because what Intel giveth, Microsoft taketh away.

Isomorphic desktop apps with Rust

· 10 min read
Bradlee Speice

I both despise JavaScript and am stunned by its success at doing some really cool things. It's this duality that's led me to a couple of (very) late nights over the past weeks trying to reconcile myself as I bootstrap a simple desktop application.

Primitives in Rust are weird (and cool)

· 7 min read
Bradlee Speice

I wrote a really small Rust program a while back because I was curious. I was 100% convinced it couldn't possibly run:

fn main() {
    println!("{}", 8.to_string())
}

And to my complete befuddlement, it compiled, ran, and produced a completely sensible output.