Modern C++ in Embedded Systems

For nearly 35 years I have been working with small processors, and there have always been deep divides between practitioners of different languages. When writing assembly we ridiculed those using C, and when I spent years microcoding we scoffed at everyone. However, nearly all groups continue to wag their heads at the shameful C++ programmers attempting to twist their tools toward the small.

Recent language developments have made C++ the obvious choice for many embedded projects; nevertheless, the toxic environment extends past Reddit roasts into poor vendor support of tools and nearly obstructionist chip manufacturers.

This session will use a bare-metal project started in 2018 for a Ciere client as a case study while we work through the decision process for using C++, the acrobatics required to support the language, recent language features that enable goals of size, speed, and expressiveness, and useful libraries.

While the examples will be based on a concrete project, the extracted lessons-learned should be applicable to many embedded projects (bare-metal and small-OS). Attendees will walk away with motivations to use C++ in embedded projects, hints and tips to making tools work, and a sampling of language features and idioms that improve the quality of a final product.


C++Now 2018


Michael Caisse


Original video was published with the Creative Commons Attribution license (reuse allowed).


Original video source:


33 thoughts on “Modern C++ in Embedded Systems”

  1. All of this assumes impeccable hardware. I would love to see you grab the debugger when your perfect code doesn't work, because something is either designed badly or broke during the development and you don't know it yet. Your approach is as good as any, but saying you'll never need the debugger is just naive.

  2. There are a lot of points here, but I will skip ahead to the point. He is definitely from one of the two major camps, doing the typical "you just don't understand." This goes nowhere, as this view is shared by both sides. I understand the appeal of both, sort of. However, the question I would have for him is: what is time? How does it pertain to a controller versus a processor? I have found Linus's answer on C++ to be incredibly spot on. Look, most programmers, even embedded programmers new out of school who commit these sins, have heard almost all of this. There are other motivations for people in that world: cost, time, effort, power, market, reliability, etc.

    The notion of software is by itself an abstraction, which points at a massive fallacy in all of this. Software conceptually configures hardware, be it an FPGA or an ALU within a CPU architecture. Again, most people know this, but seem to forget it. The point being, hardware and software do not need to be so complicated. Granted, this is more of a control perspective, which is why, coming from a processing world, you have an issue. However, even in the control world this type of software can still work. The danger still remains.

    The issue is that the industry really is divided into two worlds. It did not need to be that way, and ultimately this approach is flawed. The largest problem with this flawed state is that it will ultimately fall. Modern programming approaches are actually founded on past programming models. You can do architecture for software, or hardware, in a number of ways; this is not common knowledge. Objects have issues with interfaces. These languages have expressive issues in terms of functionality and readability. Overall they cheat the future.

    I worked for a boss who had views like this. Ironically, he was not nearly as superior as he portrayed or believed himself to be.

  3. Hi guys. This video is a little advanced for me; coming from a web dev background with light backend work, I don't know much about MPUs or MCUs, and I have little hardware knowledge. Any good suggestions of where to start? Thanks a lot.


  5. Regarding abstraction vs. performance/efficiency: chips today are so cheap and powerful that if I am at the limit of performance, where I would need to worry about abstraction performance cost, I just use a faster chip…

  6. The problem with C++ abstractions for embedded is that you will constantly go look at the code generation after optimization to make sure the code stays deterministic (some constructs introduce possible heap allocations, though in some cases those might be optimized away). In embedded, one needs to know exactly what the code is doing and that it will stay deterministically bounded in behavior. C++ just has so much layered abstraction that it is difficult to know without explicit verification – it's hard to reason about the behavior from just the high-level source code. In other contexts this doesn't matter so much, but C++ imposes its own kind of mental overhead in the embedded development context.

  7. I believe he answers his own question. What do I need to use CPP in embedded? Well, 5-10 years of experience with C, assemblers, and different compilers on various platforms, to know what to look for and how the CPP code gets compiled using one specific compiler on one specific platform. And this is the very reason CPP is bullshit.

    Certainly he is right that manufacturers do not support CPP well. But there are many reasons for it.
    One is customer need, and that means: CPP is very object oriented, and on embedded devices the only place where you really need/instantiate objects is communication.
    Everywhere else everything is static; you gain nothing from an OO perspective, and you do not need a complicated language where god knows how it gets compiled and what subset can be activated AND PORTED.
    It is already hard to design a project that relies on portable C code, with not just architecture portability but compiler portability in mind. Cross compilation already sucks without CPP.

    A C dev talking here, but I guess you figured that out by now.

  8. Who has written the link between the standard template library and the hardware data direction registers and data registers? Instruction about strategies for doing this would certainly be welcome. Until that happens, however, your speech has little meaning and less application to embedded programming.

  9. This looks nice because it wasn't put together by an EE… Oh, by the way, we haven't been able to use this in 40 days because we don't want to change some resistors… None of those problems happen with a good EE 🙂

  10. I often use the debugger to check what the code is actually doing – because most of the time I have to make adjustments to code that nobody knows who wrote, when, or with what reason in mind. It helps to verify your assumptions by seeing what is actually going on. Like, I had a problem once where I was expecting some results from a database but never even got to that part of the code.
    I could either painfully look through the whole (rather large) program and reconstruct the whole call history (C++, SQL, Java [with reflection]), or throw on the debugger and look at what is happening there.
    Turns out they had hard-coded the object transformation to look at specific values and, depending on those, convert just certain parts of the object, under the assumption that the left-out parts were not used. When I later asked that developer about it: "Because that is faster" – on a user-interaction workflow where the "faster" was not having to assign a few integers and strings.

  11. When the compiler comes up with an error, 90% of the time it points to something completely different. Always read the manual and understand how it's meant to work; hacking in C++ (i.e. changing it until it works) is never going to end well.

  12. This might be a naive question, but how do you know that your abstraction is really zero-cost (without reimplementing your algorithms manually and comparing the ASM output, which would of course defeat the purpose)?
