• 0 Posts
  • 99 Comments
Joined 1 year ago
Cake day: July 10th, 2023

  • That’s a valid point: the dev cycle is compressed now and customer expectations are low.

    So instead of putting in the long-term effort to deliver and support a quality product, something that should have been considered a beta is just shipped and called “good enough”.

    A good example, I guess, would be a long-term embedded OSS project like Tasmota, compared to the barely functional stock firmware on the devices that people buy just to reflash to Tasmota.

    Still, few things frustrate me like a Bluetooth device that really shouldn’t have been a Bluetooth device and has non-deterministic behaviour due to a lack of initialization or some other trivial fault. Why did the tractor work lights turn on purple today? Nobody knows!


  • My type is a dying breed too: the guys who do their best to write robust code and actually try to consider edge cases, race conditions, properly sized variables, and efficient use of cycles, all the things embedded guys have done as “embedded” evolved from the 6800 to PIC, Atmel, and then ESP platforms.

    Now people seem to have embraced “move fast and break things”, but that’s the exact opposite of how embedded is supposed to be done. Don’t get me wrong, there is some great ESP code out there, but there’s also a shitload of buggy, poorly documented libraries and devices that require far too many power cycles to keep functioning.

    In my opinion, one power cycle is too many in the embedded world. Your code should not leak memory. We grew up with BYTES of RAM to use; memory leaks were unthinkable! (There’s a sketch of the no-heap style I mean at the end of this comment.)

    And don’t get me started on the appalling mess that modern engineers can make with function blocks inside a PLC, or their seeming lack of knowledge of industrial control standards that have existed since before the PLC.
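
    Back to the memory point, a minimal sketch of what I mean, with all names and sizes made up for illustration: everything statically allocated, so the worst-case RAM footprint is known at link time and there is nothing to leak.

    ```c
    #include <stdint.h>

    #define MSG_POOL_SIZE 8          /* worst case fixed at compile time */

    typedef struct {
        uint8_t in_use;
        uint8_t payload[16];
    } msg_t;

    static msg_t msg_pool[MSG_POOL_SIZE];   /* no malloc, no free, no leak */

    msg_t *msg_alloc(void)
    {
        for (uint8_t i = 0; i < MSG_POOL_SIZE; i++) {
            if (!msg_pool[i].in_use) {
                msg_pool[i].in_use = 1;
                return &msg_pool[i];
            }
        }
        return 0;   /* pool exhausted: a condition you handle, not a crash */
    }

    void msg_free(msg_t *m)
    {
        m->in_use = 0;
    }
    ```

    The point of the design is that exhaustion is an explicit, testable condition instead of a slow death by fragmentation or leaks.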


  • I love the term “write-only code”; it’s perfect. I used to love Perl, as it felt like it flowed straight from my brain into the keyboard. What a free and magical language.

    So it turned out I had ADHD. Took meds, went back to C/C++ with renewed appreciation, and haven’t touched Perl since, as it horrifies me to look at it. What a nightmare of dangling references and questionable typing. Any language that allows you to cast a string to a function and call it really needs to sit down and think about what it’s doing.


  • Aspartame itself is completely safe, but recent studies have found that all artificial sweeteners have metabolic effects. It’s not that the chemicals themselves are hazardous (as you say, aspartame in particular has been very rigorously studied); rather, it appears the body uses the sweet taste as a signal to change insulin production.

    Is it better to have a diet pop than a sugar pop? Definitely, and I prefer Diet Coke these days; Classic feels like drinking syrup.

    However, it’s even healthier just to drink water, or non-caloric, unsweetened drinks like coffee or tea. Soft drinks are supposed to be a treat, not a food group; I drink maybe one a week.


  • If you don’t want memory-safe buffer overruns, don’t write C/C++.

    Fixed further?

    It’s perfectly possible to write C++ code that won’t fall prey to buffer overruns. C is a lot harder. However, yes, it’s far from memory safe; you can still do stupid things with pointers and freed memory if you want to.

    I’ll admit, as I grew up with C, I still have a love for some of its oh-so-simple features like structs. For embedded work, give me a packed struct over complex serialization libraries any day (rough sketch below).

    I tend to write a hybrid of the two languages for my own projects, and I’ll be honest, I’ve forgotten where exactly the line lies between them.
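
    The rough sketch I mentioned, with a hypothetical wire format (field names and sizes invented for illustration; assumes both ends agree on endianness and the compiler honours the packing pragma):

    ```c
    #include <stdint.h>
    #include <string.h>

    #pragma pack(push, 1)            /* no padding: the struct IS the wire format */
    typedef struct {
        uint8_t  msg_id;             /* message type */
        uint16_t sensor_mv;          /* reading in millivolts */
        uint32_t uptime_s;           /* seconds since boot */
        uint8_t  crc;                /* checksum over the preceding bytes */
    } sensor_msg_t;                  /* exactly 8 bytes on the wire */
    #pragma pack(pop)

    /* "Serialization" is a single memcpy, no library required. */
    void send_reading(uint8_t *tx_buf, uint16_t mv, uint32_t uptime)
    {
        sensor_msg_t msg = { .msg_id = 0x01, .sensor_mv = mv, .uptime_s = uptime, .crc = 0 };
        /* msg.crc = your_crc8(&msg, sizeof msg - 1);   <- plug in a real CRC here */
        memcpy(tx_buf, &msg, sizeof msg);
    }
    ```

    The trade-off, of course, is that you’ve pinned the byte layout to one agreed convention, which on a point-to-point embedded link is usually exactly what you want.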



  • A million tiny decisions can be just as damaging. In my limited experience with several different local and cloud models, you have to review basically all output, as they can confidently introduce small errors. Often the code will compile and run, but it contains subtle mistakes that cause output to drift, or the aforementioned long-run overflow type errors.

    Those are the errors that junior or lazy coders will never notice and will walk away from, causing hard-to-diagnose failures down the road. And the code “looks fine”, so reviewers would need to really go over it with a fine-toothed comb, which only happens in critical industries. (A classic of the genre is sketched below.)

    I will only use AI to write comments and documentation blocks, and to get jumping-off points for algorithms I don’t keep in my head (“write a function to sort this array”). It’s better than Stack Exchange for that, IMO.
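
    The classic I had in mind, assuming an Arduino-style millis() tick counter (illustrative, not from any real review): a timeout check that compiles, runs, and passes every test, until the 32-bit millisecond counter wraps after about 49.7 days.

    ```c
    #include <stdint.h>

    extern uint32_t millis(void);   /* assumed Arduino-style free-running tick */

    uint32_t deadline;              /* set to millis() + timeout elsewhere */

    void poll(void)
    {
        /* Looks fine, drifts into failure: breaks when millis() wraps to 0. */
        if (millis() > deadline) { /* handle timeout */ }

        /* Wraparound-safe: the unsigned difference, viewed as signed,
           stays correct across the wrap. */
        if ((int32_t)(millis() - deadline) >= 0) { /* handle timeout */ }
    }
    ```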


  • I tried using AI tools to do some cleanup and refactoring of legacy embedded C code, and I was curious whether it could do any optimization or knew any clever algorithms.

    It’s pretty good at figuring out the function of the code and adding comments, and it did some decent refactoring of some sections to make them more readable.

    It has no clue, though, about how to work in a resource-constrained environment or about the main concepts that separate embedded from everything else: namely, that the code has to be able to run “forever”, operate in realtime on a constant flow of sensor data, and manage its own memory, because nobody else is taking care of it.

    It even explained to me that we could do input filtering by using big arrays for simple averaging on a device with only 1 kB of RAM, or use a long long for a never-reset accumulator without worrying about what will happen, because “it will be years before it overflows”. (A constant-memory alternative is sketched below.)

    AI buddy, some of these units have run for decades without a power cycle. If lazy coders start dumping AI output into embedded systems, the whole world is going to get a lot more glitchy.
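
    For contrast, a sketch of what I’d expect instead of a big averaging array on a 1 kB part: a fixed-point exponential moving average. Two bytes of state, no accumulator that grows without bound, runs “forever” (the smoothing divisor here is illustrative).

    ```c
    #include <stdint.h>

    #define EMA_DIV 8               /* smoothing factor 1/8, pick to taste */

    static uint16_t ema;            /* the only state: two bytes, never grows */

    uint16_t ema_update(uint16_t sample)
    {
        /* new = old + (sample - old) / EMA_DIV, all in integer math.
           Note: deltas smaller than EMA_DIV are lost to truncation;
           carry extra fraction bits in the state if that matters. */
        int32_t delta = (int32_t)sample - (int32_t)ema;
        ema = (uint16_t)((int32_t)ema + delta / EMA_DIV);
        return ema;
    }
    ```

    Call it once per sample and the result always lies between the old estimate and the new reading, so there is nothing to overflow and nothing to reset.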