This article is written by hand; AI is only used to check the spelling.
A Few Notes From Migrating From RMQ to gRPC/Protobuf
For the past few weeks, I’ve been migrating part of our architecture from RabbitMQ messages to gRPC with Protobuf contracts. The original goal was simple: stronger contracts between services, generated clients, generated models, fewer serialization mistakes, and less “hope-driven development” between Python and C# services. Conceptually, it makes perfect sense. In practice… it was a bit more painful than expected.
Our previous communication layer was based on RMQ messages with manually maintained models.
This works fine at the beginning. Until it doesn't.
At some point, you realize that half the bugs are not business logic bugs anymore. They are communication bugs.
So Protobuf looked like the obvious solution. The tooling experience, however, was... interesting.
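For context, here is roughly the kind of contract we were after. The service and message definitions below are made up for illustration; they are not our actual filter.proto:

```proto
syntax = "proto3";

package filter;

// A single contract file: both the Python and the C# side generate
// their models and client/server code from this definition.
service FilterService {
  rpc Apply (FilterRequest) returns (FilterResponse);
}

message FilterRequest {
  string query = 1;
  repeated string tags = 2;
}

message FilterResponse {
  repeated string matches = 1;
}
```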
The very first thing I learned is that protoc really does not like helping you.
One of my first notes during the migration was:
that shit cannot generate an output folder...
Which honestly summarizes the experience pretty well.
You need to generate:
- the Python message modules (*_pb2.py),
- the .pyi stubs for typing,
- the gRPC client/server stubs (*_pb2_grpc.py),
- and the C# classes for the other side.

Even an empty .proto file somehow manages to generate a surprising amount of files.
At first I tried using the standalone protoc binary directly:
protoc ./main.proto ./filter/service.proto ./filter/model.proto \
--python_out ./python \
--pyi_out ./python \
--csharp_out c_sharp
Which worked.
Then I switched to the Python module version:
python -m grpc_tools.protoc ...
And immediately everything broke.
The error message I got was honestly one of the best parts of the experience:
File does not reside within any path specified using --proto_path.
Note that the proto_path must be an exact prefix of the .proto file names
-- protoc is too dumb to figure out when two paths are equivalent
(it's harder than you think).
Apparently grpc_tools.protoc requires explicit -I paths while the standalone protoc was more permissive in my setup.
So the correct command became:
./venv/bin/python -m grpc_tools.protoc \
-I ./protos \
--python_out=./test \
--pyi_out=./test \
--grpc_python_out=./test \
protos/filter.proto
Which finally generated:
| File name | Content |
| :--- | :--- |
| *_pb2.py | The Protobuf message descriptors and runtime serialization definitions used internally by Protobuf. |
| *_pb2.pyi | The typing information. Useful for autocomplete and static analysis. |
| *_pb2_grpc.py | The generated gRPC client/server logic. |
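To make the roles of these files concrete, here is a minimal sketch of how they are typically consumed on the Python side. The FilterService stub, the Apply method and the FilterRequest fields are the hypothetical ones from the contract sketch above, not our real API:

```python
import grpc

import filter_pb2
import filter_pb2_grpc

# filter_pb2 provides the message classes...
request = filter_pb2.FilterRequest(query="status:active", tags=["prod"])

# ...and filter_pb2_grpc provides the client stub for the service.
channel = grpc.insecure_channel("localhost:50051")
stub = filter_pb2_grpc.FilterServiceStub(channel)

response = stub.Apply(request)
print(response.matches)
```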
Then came the real issue.
The generated gRPC Python file imports the generated Protobuf module like this:
import filter_pb2 as filter__pb2
Which immediately broke in my project structure:
ModuleNotFoundError: No module named 'filter_pb2'
Because the generated files lived inside a Python package, but the server was started from the parent directory.
Classic Python import chaos.
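For the curious, the layout looked roughly like this (simplified, the real project structure differs):

```
project/
├── server.py                # started as: python server.py
└── generated/
    ├── __init__.py
    ├── filter_pb2.py
    └── filter_pb2_grpc.py   # does `import filter_pb2`, which only resolves
                             # if generated/ itself is on sys.path
```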
The first StackOverflow thread I found — from 2020 — suggested modifying the generated code manually:
import .filter_pb2 as filter__pb2
Which is invalid syntax.
The actual fix is:
from . import filter_pb2 as filter__pb2
But this raises a more important question:
Why am I modifying generated code?
Generated code is supposed to be disposable.
If regeneration breaks your fixes, something is wrong with the workflow.
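One pragmatic workaround is to make the patching part of the generation step itself, so the fix survives regeneration. Here is a rough sketch of that idea; the output directory and the regex reflect my layout and are assumptions, not a general-purpose tool:

```python
import re
from pathlib import Path

GENERATED_DIR = Path("generated")  # wherever grpc_tools.protoc writes its output

# Rewrite `import filter_pb2 as filter__pb2` into a relative import,
# keeping the alias so the rest of the generated file stays untouched.
pattern = re.compile(r"^import (\w+_pb2) as (\w+__pb2)$", re.MULTILINE)

for path in GENERATED_DIR.glob("*_pb2_grpc.py"):
    source = path.read_text()
    path.write_text(pattern.sub(r"from . import \1 as \2", source))
```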
I eventually found entire projects that exist almost entirely to patch these generated imports.
Being a native French speaker, one of the package names makes me laugh a lot.
Another interesting discovery:
python -m grpc_tools.protoc --version
# libprotoc 31.1
protoc --version
# libprotoc 33.4
So the Python tooling and the standalone tooling were not even using the same Protobuf version. Very reassuring...
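If you want to check which Protobuf runtime your Python environment actually uses (as opposed to whichever protoc binary happens to be on your PATH), a quick sanity check looks like this:

```python
import grpc
from google.protobuf import __version__ as protobuf_version

# Worth comparing against the protoc / grpc_tools versions you generate with.
print("grpcio:", grpc.__version__)
print("protobuf runtime:", protobuf_version)
```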
Ironically, the C# experience was much cleaner.
Basic generation worked immediately:
protoc -I../protos --csharp_out . ../protos/filter.proto
And one very important detail:
The package declaration inside the .proto file becomes the generated C# namespace.
Which is actually very convenient once you understand it.
But the real interesting part was discovering that .NET can generate everything automatically during build time.
You simply add this to the .csproj:
<ItemGroup>
<Protobuf Include="../../protos/*.proto" GrpcServices="Client" />
</ItemGroup>
Honestly, this felt a bit magical after the Python experience.
Another confusing thing:
grpc_tools.protoc can generate Python gRPC stubs with:
--grpc_python_out
But it cannot generate the C# gRPC side because:
protoc-gen-csharp: program not found or is not executable
Which initially confused me because modern protoc already includes C# support.
So depending on which entry point you use (standalone protoc, grpc_tools.protoc, or the .NET build integration), you get different behaviors and different supported generators.
This ecosystem feels heavily fragmented.
After spending time with it, I understand why people love gRPC and Protobuf.
The core idea is excellent.
This is exactly what I was looking for when starting the migration. But the tooling ecosystem feels surprisingly rough around the edges.
There are multiple ways to do the same thing, and sometimes it feels like every stack combination has its own unofficial workaround culture.
Still, despite all the friction, I think the migration is worth it.