Yacine's Bloje

Fetch and build Chromium

· Yacine Sellami

Minimum hardware requirements.

Windows prerequisites

Visual Studio (required)

Chromium requires Visual Studio 2022, version 17.0.0 or later.


IDE recommendation

VS Code is the go-to recommendation for Chromium development, but its bloated, Electron-based architecture doesn't help when we're trying to conserve resources and maximize stability on a very large code base. My recommendation is Zed, for its performance and minimalism; it is also compatible with everything we need.

Get Zed

Clangd configuration

Clangd is the language server for C/C++ that provides code completion, diagnostics, and the other usual features. Make sure to use the clangd binary Chromium ships, located at the path below; if you do not, you will encounter the terrors of precompiled header (PCH) errors due to version mismatches.

src/third_party/llvm-build/Release+Asserts/bin/clangd

Verify that it exists (run from the src directory):

ls third_party/llvm-build/Release+Asserts/bin/clangd

If it doesn't, you can always obtain or update it:

tools/clang/scripts/update.py

Whether you are using Zed or not, you will have to point your editor's clangd binary/command at the one mentioned above.

We will be using the config below; it essentially tells clangd how to behave sanely and reliably in a huge repo like Chromium. Save it as .clangd in the src directory; clangd picks up .clangd files automatically.

Diagnostics: it turns off warnings about unused and missing includes. Chromium code relies heavily on transitive includes and build-system magic, so those warnings are mostly noise that slows you down.

CompileFlags / Remove: this strips out GN/Ninja flags that clangd doesn't understand or shouldn't care about. If you leave them in, clangd gets confused, indexing breaks, or diagnostics become wrong, and memory usage bloats. Removing them keeps clangd's view of the code clean and stable.

Global rule (PathMatch: ".*"): by default, background indexing is disabled. This stops clangd from trying to index the entire Chromium tree, which would eat CPU, RAM, and disk and make your editor lag or crash.

Second rule: what to index depends on what you are working on, but as a general approach, background indexing is turned back on for important core folders (base, net, content, blink, skia, etc.). These are the files you actually navigate and refactor, so indexing is worth it there.

TLDR: Less noise, fewer fake errors, no full-repo indexing meltdown, fast and stable editor, and full clangd features only where they matter.

```yaml
Diagnostics:
  UnusedIncludes: None
  MissingIncludes: None
CompileFlags:
  Remove: ["-cfg=*", "-exec_root=*", "-inputs=*", "-DUNSAFE_BUFFERS_BUILD", "--warning-suppression-mappings=*"]
---
If:
  PathMatch: ".*"
Index:
  Background: Skip
---
If:
  PathMatch: '(^|.*/)(base|url|content|mojo|net|storage|viz|cc|gpu)(/.*)?$|(^|.*/)services(/network(/.*)?)?$|(^|.*/)ui/events(/.*)?$|(^|.*/)third_party/blink/renderer/(core|platform|bindings)(/.*)?$|(^|.*/)third_party/skia(/.*)?$'
Index:
  Background: Build
```

Next, clangd needs compile commands to work properly. These are exported from gn, which is covered in the Compiling section; we'll get there.

Create a working folder and get depot_tools

Pick a location for your Chromium workdir:

mkdir C:\chromium_fork
cd C:\chromium_fork
git clone https://chromium.googlesource.com/chromium/tools/depot_tools

For Linux, do the same inside your $HOME directory.

Add depot_tools to PATH

Windows

set "PATH=C:\chromium_fork\depot_tools;%PATH%"

Persist it for future sessions (note: setx stores the expanded value of %PATH% and truncates anything beyond 1024 characters, so check the result if your PATH is long):

setx PATH "C:\chromium_fork\depot_tools;%PATH%"

Linux for bash

export PATH="$HOME/chromium_fork/depot_tools:$PATH"
echo 'export PATH="$HOME/chromium_fork/depot_tools:$PATH"' >> ~/.bashrc
source ~/.bashrc

Linux for zsh

export PATH="$HOME/chromium_fork/depot_tools:$PATH"

Persist it for future sessions.

echo 'export PATH="$HOME/chromium_fork/depot_tools:$PATH"' >> ~/.zshrc
source ~/.zshrc
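With either shell, a quick sanity check that the depot_tools entry points now resolve from PATH:

```shell
# Each depot_tools entry point should resolve; anything missing prints NOT FOUND.
for tool in fetch gclient gn autoninja; do
  printf '%s -> %s\n' "$tool" "$(command -v "$tool" || echo 'NOT FOUND')"
done
```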

Enable git cache

Git cache reduces redundant network fetches and avoids repeated downloads across syncs. It’s most useful on slow or unreliable networks, but it’s still a good long-term setup. The trade-off is extra disk usage.

Windows

set "GIT_CACHE_PATH=%USERPROFILE%\.git-cache"
mkdir "%GIT_CACHE_PATH%"
setx GIT_CACHE_PATH "%GIT_CACHE_PATH%"

Linux

mkdir -p "$HOME/.git-cache"
echo 'export GIT_CACHE_PATH="$HOME/.git-cache"' >> "$HOME/.profile"
export GIT_CACHE_PATH="$HOME/.git-cache"

If you don’t want depot_tools to auto-update:

Windows

setx DEPOT_TOOLS_UPDATE 0

Linux

echo 'export DEPOT_TOOLS_UPDATE=0' >> "$HOME/.profile"
export DEPOT_TOOLS_UPDATE=0

Prepare gclient

gclient is the depot_tools wrapper that manages Chromium’s multi-repo dependencies (based on .gclient and DEPS).

Windows:

mkdir chromium
cd chromium
fetch chromium
cd src

Linux

mkdir -p "$HOME/chromium"
cd "$HOME/chromium"
fetch --nohooks chromium
cd src
./build/install-build-deps.sh
gclient runhooks

When fetch finishes, you’ll have a src/ directory and a .gclient file in the parent directory.
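For reference, the generated .gclient is a small Python-syntax config that looks roughly like this (exact contents vary with the fetch version):

```python
# Sketch of a .gclient generated by `fetch chromium`; fields may differ slightly.
solutions = [
  {
    "name": "src",
    "url": "https://chromium.googlesource.com/chromium/src.git",
    "managed": False,
    "custom_deps": {},
    "custom_vars": {},
  },
]
```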

Compiling

Let’s start by entering the src directory

$ cd src

Switching to specific tags (versions of Chromium)

If you want your local Chromium checkout to match a version that’s actually shipped, switch to a release tag (the same kind of version you see in chrome://version, e.g. 143.0.7499.170). Chromium publishes these tags in its source repo.

Run the following from your src directory


# Ensures all Chromium dependencies like the repos referenced by DEPS are present 
# and updated, and also makes sure branch heads and tags are available across the repos.
# This reduces missing refs and tags issues when you move across multiple releases.

$ gclient sync --with_branch_heads --with_tags

# Updates your local Git repo with the latest tags from the remote 
#(so 143.0.7499.170 actually exists locally).

$ git fetch --tags

# Creates (or resets) a local branch named my_local_143.0.7499.170 to point exactly at the tag 143.0.7499.170. 
# Using a branch (instead of staying detached on the tag) is important if you plan to commit your own changes 
# but this guide doesn't assume such thing.
$ git checkout -B my_local_143.0.7499.170 143.0.7499.170

# After changing the Chromium tag, re-syncs dependencies so they match the DEPS file at that exact tag. 
# This is the step that keeps your checkout consistent and buildable.

$ gclient sync --with_branch_heads --with_tags
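If you are not sure which tags exist locally, you can list a release series straight from git (run from src after the fetch above; sort -V orders them as version numbers):

```shell
# Show the five most recent tags in the 143.* series, version-sorted.
git tag --list '143.*' | sort -V | tail -n 5
```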

Next we pick a name for our build directory; I use out/Default, as it's a common choice.

This command runs GN, a meta-build system that generates Ninja build files.

$ gn gen out/Default

(Optional) Then for clangd, we export compile commands:

gn gen out/Default --export-compile-commands

Set the resulting directory inside your IDE config that runs clangd:

"--compile-commands-dir=/mnt/ssd/chromium/src/out/Default"
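In Zed, for example, this can live in settings.json; a minimal sketch, assuming Zed's lsp settings layout, with placeholder paths you should replace with your own checkout:

```json
{
  "lsp": {
    "clangd": {
      "binary": {
        "path": "/path/to/chromium/src/third_party/llvm-build/Release+Asserts/bin/clangd",
        "arguments": ["--compile-commands-dir=/path/to/chromium/src/out/Default"]
      }
    }
  }
}
```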

Edit the arguments stored in args.gn to match your desired build; the file is located at out/Default/args.gn.

These are the commonly used ones. You can list every argument and its documentation with `gn args out/Default --list`, or dig into the Chromium code base at https://chromium.googlesource.com/

```bash
is_debug = false
is_component_build = true

# Possible values are 0 | 1 | 2
symbol_level = 0
blink_symbol_level = 0
v8_symbol_level = 0

# Possible values are "x64" | "x86" | "arm"
target_cpu = "x64"

# Limit memory-heavy links
concurrent_links = 1
enable_vulkan = true
enable_swiftshader_vulkan = true
```

Keep in mind that when is_debug=true, compile times are usually faster because the compiler skips most optimization passes. Build speed is also heavily affected by symbol level.

The same idea applies to is_component_build=true: linking is generally faster because most code is built as shared libraries instead of one large fat binary.
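Putting those two ideas together, a faster-iterating debug variant of args.gn might look like this (a sketch; tune the values to your machine):

```bash
is_debug = true
is_component_build = true

# Low symbol levels keep debug compile and link times manageable.
symbol_level = 1
blink_symbol_level = 0
v8_symbol_level = 0
```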

Run this command to compile Chromium:

$ autoninja -C out/Default chrome

Notes about compiling and performance.

Windows Defender, or any AV/middleware that watches files, will hinder compile speed. If your system struggles under the default parallelism but can sustain a high load for a very long time without crashing, you can reduce the number of parallel jobs:

$ autoninja -C out/Default -j 24 chrome
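On Linux, a reasonable starting point for the job count is the total core count minus a couple of cores of headroom; a small sketch:

```shell
# Leave two cores free so the machine stays responsive (Linux: nproc).
jobs=$(( $(nproc) - 2 ))
if [ "$jobs" -lt 1 ]; then jobs=1; fi
echo "autoninja -C out/Default -j $jobs chrome"
```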

If you would like a results summary after compiling, set this environment variable before compiling:

$ set NINJA_SUMMARIZE_BUILD=1

And the output will be the following.

$ autoninja -C out\Default base
Longest build steps:
       0.1 weighted s to build obj/base/base/trace_log.obj (6.7 s elapsed time)
       0.2 weighted s to build nasm.exe, nasm.exe.pdb (0.2 s elapsed time)
       0.3 weighted s to build obj/base/base/win_util.obj (12.4 s elapsed time)
       1.2 weighted s to build base.dll, base.dll.lib (1.2 s elapsed time)
Time by build-step type:
       0.0 s weighted time to generate 6 .lib files (0.3 s elapsed time sum)
       0.1 s weighted time to generate 25 .stamp files (1.2 s elapsed time sum)
       0.2 s weighted time to generate 20 .o files (2.8 s elapsed time sum)
       1.7 s weighted time to generate 4 PEFile (linking) files (2.0 s elapsed time sum)
      23.9 s weighted time to generate 770 .obj files (974.8 s elapsed time sum)
26.1 s weighted time (982.9 s elapsed time sum, 37.7x parallelism)
839 build steps completed, average of 32.17/s