Have you ever looked at a char and thought: “Damn, 1 byte for a single char is pretty darn inefficient”? No? Well, I did. So what I decided to do was take 5 chars, convert each char to a 2-digit integer, and then concatenate those five 2-digit ints into one big unsigned int, and boom: I stored 5 chars using only 4 bytes instead of 5. The reason this works is that an unsigned int is up to ten digits long, so each char fits into 2 of those digits. In theory you could encode 32 different chars with this technique (the first two digits of the maximum unsigned int are 42, and if you don’t want to deal with a possible 0 at the beginning you end up with 32 usable values). If you decided to use all 10 digits, you could store exactly 3 chars. Why would anyone do that? Idk. Is it way too much work to be useful? Yes. Was it funny? Yes.

Anyone who’s interested in the code: here’s how I did it in C: https://pastebin.com/hDeHijX6

Yes, I know the code is probably bad, but I don’t care. It was just a funny, useless idea I had.
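For anyone who doesn’t want to click through, here is a minimal sketch of the scheme described above. It is not the pastebin code; the 32-character alphabet and the 10–41 code range are just assumptions to make the example self-contained.

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical 32-character alphabet (an assumption for this sketch,
 * not necessarily the set used in the pastebin). Each character maps
 * to a two-digit code in 10..41, so a packed five-char value is a
 * ten-digit number that never starts with 0 and never exceeds
 * UINT32_MAX (4294967295). */
static const char ALPHABET[] = "abcdefghijklmnopqrstuvwxyz .,!?-";

static uint32_t char_to_code(char c) {
    for (uint32_t i = 0; i < 32; i++)
        if (ALPHABET[i] == c)
            return 10 + i;          /* codes 10..41 */
    return 10;                      /* unknown chars fall back to 'a' */
}

/* Pack 5 chars into one uint32_t by concatenating their two-digit
 * codes as a decimal number: c0c1c2c3c4. */
static uint32_t pack5(const char *s) {
    uint32_t packed = 0;
    for (int i = 0; i < 5; i++)
        packed = packed * 100 + char_to_code(s[i]);
    return packed;
}

/* Unpack by peeling off two decimal digits at a time. */
static void unpack5(uint32_t packed, char out[6]) {
    for (int i = 4; i >= 0; i--) {
        out[i] = ALPHABET[packed % 100 - 10];
        packed /= 100;
    }
    out[5] = '\0';
}

int main(void) {
    char buf[6];
    uint32_t packed = pack5("hello");
    unpack5(packed, buf);
    printf("%u -> %s\n", packed, buf);  /* prints 1714212124 -> hello */
    return 0;
}
```

With codes capped at 41, the largest possible packed value is 4141414141, which still fits in 32 bits.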

  • Saleh@feddit.org · 8 points · 8 days ago

    I feel like many programmers (or their management) have grown ignorant of resource limitations over the past decade or so.

    Obviously there are good examples, like many Linux distros running well on 4 GB of RAM, but when it comes to Windows, websites, and proprietary programs, they gobble up insane amounts of RAM to provide almost the same functionality as in 2010.

    • UnPassive@lemmy.world · 1 point · 4 days ago

      I agree 100%! But I’m joking about a façade of optimization: making code confusing and hard to interface with by making up custom data types. And for more context, their main project is a UI that takes >10 seconds to load and uses 2+ GB of RAM. But at least the UUIDs in the SQLite DB are stored as hex instead of strings 😅 (even though I think everything in SQLite is actually stored as a string under the hood?)

      I do still admire the desire for optimization - but it might be some sort of coping mechanism to ignore the insanely unoptimized bits of the project

    • Hoimo@ani.social · 7 points · 8 days ago

      It’s just not on their radar at all these days. You want to develop and iterate quickly, so you’re not going to program everything from scratch. No, you grab an off-the-shelf framework and implement only your business-specific things in that framework. There’s so many layers of abstraction that optimization becomes impossible (beyond what the framework does for you), but it saves you a ton of expensive developer hours and gets you to market really fast. And when someone complains that your website doesn’t perform for shit, you just blame their hardware, right? Externalize those costs.

    • MonkeMischief@lemmy.today · 4 points · 8 days ago

      “they gobble up insane amounts of RAM to provide almost the same functionality as in 2010.”

      Critical to using our service? Maybe even an operating system?

      ELECTRON APP!

    • BartyDeCanter@lemmy.sdf.org · 3 points · 8 days ago

      4 GB to run well… I remember happily running Linux on 4 MB of RAM, complete with X and a web browser. I also remember running BeOS on a machine with 64 MB of RAM and having one of the best desktop experiences I’ve ever used.