• 0 Posts
  • 18 Comments
Joined 1 year ago
Cake day: June 19th, 2023

  • wols@lemm.ee to Programmer Humor@lemmy.ml · Got no time to code · 5 months ago

    Bonus: good tests can also serve as technical documentation.
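
    For instance, a few assertions can state a function’s contract more precisely than prose. A minimal sketch (TypeScript as an arbitrary choice; slugify is a made-up helper, not from any library):

    ```typescript
    import { strict as assert } from "node:assert";

    // Hypothetical helper, purely for illustration.
    function slugify(title: string): string {
      return title
        .toLowerCase()
        .trim()
        .replace(/[^a-z0-9]+/g, "-") // collapse runs of non-alphanumerics into one dash
        .replace(/^-|-$/g, "");      // strip a leading/trailing dash
    }

    // Each assertion states one concrete, guaranteed behavior:
    // the same facts good reference documentation would spell out.
    assert.equal(slugify("Hello, World!"), "hello-world");
    assert.equal(slugify("  spaces   everywhere "), "spaces-everywhere");
    assert.equal(slugify("already-a-slug"), "already-a-slug");
    ```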

    Though I have to disagree with the notion that documentation is as important as code, or even more so.
    Documentation is certainly near the top of the list and often undervalued. I’ve worked on a project where documentation was lacking and it was painful to say the least.
    Without documentation, changing or adding features can be a nightmare. Investigating bugs and offering support is also very difficult. But without code, you have nothing. No product, no users, no value.

    There are (inferior) substitutes for documentation: specialized team knowledge, general technical expertise. These alternative pools of knowledge can be leveraged to create and improve documentation incrementally.
    There’s no replacement for the actual functionality of your applications.


  • TLDR:
    Nature can’t simply select out consciousness, because it emerges from hardware that is useful in other ways. The brain doesn’t waste energy on consciousness; it uses energy for computation, which is useful in myriad ways.

    The usefulness of consciousness from an evolutionary fitness perspective is a tricky question to answer in general terms. An easy intuition might be to look at the utility of pain for the survival of an individual.

    I personally think that, ultimately, consciousness is a byproduct of a complex brain. The evolutionary advantage is mainly given by other features enabled by said complexity (generally more sophisticated and adaptable behavior, social interactions, memory, communication, intentional environment manipulation, etc.) and consciousness basically gets a free ride on that already-useful brain.
    Species with more complex brains have an easier time adapting to changes in their environment because their brains allow them to change their behavior much faster than random genetic mutations would. This opens up many new ecological niches that simpler organisms wouldn’t be able to fill.

    I don’t think nature selects out waste. As long as a species is able to proliferate its genes, it can be as wasteful as it “wants”. It only has to be fit enough, not as fit as possible. E.g. if there’s enough energy available to sustain a complex brain, there’s no pressure to make it more economical by simplifying its function. (And there are many pressures that can be reacted to without mutation when you have a complex brain, so I would guess that, on the whole, evolution in the direction of simpler brains requires stronger pressures than other adaptations)


  • I want to preface this with the mention that understanding other people’s code and being able to modify it in a way that gets it to do what you want is a big part of real world coding and not a small feat.
    The rest of my comment may come across as “you’re learning wrong”. It is meant to. I don’t know how you’ve been learning and I have no proof that doing it differently will help, but I’m optimistic that it can. The main takeaway is this: be patient with yourself. Solving problems and building things is hard. It’s ok to progress slowly. Don’t try to skip ahead, especially early on.
    (also this comment isn’t directed at you specifically, but at anyone who shares your frustration)

    I was gonna write an entire rant opposing the meme, but thought better of it as it seems most people here agree with me.
    BUT I think that once you’ve got some basics down, there really is no better way to improve than to do. The key is to start at the appropriate level of complexity for your level of experience.
    Obviously I don’t know what that is for you specifically, but I think in general it’s a good idea to start simple. Don’t try to engineer an entire application as your first programming activity.

    Find an easy (and simple! as in - a single function with well defined inputs and outputs and no side effects) problem; either think of something yourself, or pick an easy problem from an online platform like leetcode or codechef. And try to solve the problem yourself. There’s no need to get stuck for ages, but give it an honest try.
    I think a decent heuristic for determining if you have a useful problem is whether you feel like you’ve made significant progress towards a solution after an hour or two. If not, readjust and pick a different problem. There’s no point in spending days on a problem that’s not clicking for you.
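
    To make “easy and simple” concrete, here’s one arbitrary example of the kind of problem I mean (a single pure function, well-defined input and output, no side effects):

    ```typescript
    // Problem: given a string, return how many vowels it contains.
    function countVowels(text: string): number {
      let count = 0;
      for (const ch of text.toLowerCase()) {
        if ("aeiou".includes(ch)) {
          count++;
        }
      }
      return count;
    }

    console.log(countVowels("Programming")); // 3 (o, a, i)
    console.log(countVowels("xyz"));         // 0
    ```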

    If you weren’t able to solve the problem, look at solutions. Pick the one that seems most straightforward to you and try to understand it. When you think you do, give the original problem a little twist and try to solve that, referencing the solution to the original if you need to.
    If you’re struggling with this kind of constrained problem, keep doing them. Seriously. Perhaps dial down the difficulty of the problems themselves until you can follow and understand the solutions. But keep struggling with trying to solve little problems from scratch. Because that’s the essence of programming: you want the computer to do something and you need to figure out how to achieve that.
    It’s not automatic, intuitive, inspired creation. It’s not magic. It’s a difficult and uncertain process of exploration. I’m fairly confident that for most people, coding just isn’t how their brain works, initially. And I’m also sure that for some it “clicks” much easier than for others. But fundamentally, the skill to code is like a muscle: it must be trained to be useful. You can listen to a hundred talks on the mechanics of bike riding, and be an expert on the physics. If you don’t put in the hours on the pedals, you’ll never be biking from A to B.
    I think this period at the beginning is the most challenging and frustrating, because you’re working so hard and seemingly progress so slowly. But the two are connected. You’re not breezing through because it is hard. You’re learning a new way of thinking. Everything else builds on this.

    Once you’re more comfortable with solving isolated problems like that, consider making a simple application. For example: read an input text file, replace all occurrences of one string with another string, write the resulting text to a new text file. Don’t focus on perfection or best practices at first. Simply solve the problem the way you know how. Perhaps start with hard-coded values for the replacement, then make them configurable (e.g. by passing them as arguments to your application).
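
    A minimal sketch of that exercise (TypeScript on Node as one arbitrary choice; the file names and argument order are just illustrative):

    ```typescript
    import { readFileSync, writeFileSync } from "node:fs";

    // Step 1 could use hard-coded values:
    // const [inputPath, outputPath, search, replacement] =
    //   ["input.txt", "output.txt", "foo", "bar"];

    // Step 2: make them configurable by passing them as arguments.
    const [inputPath, outputPath, search, replacement] = process.argv.slice(2);

    const text = readFileSync(inputPath, "utf8");
    // split/join replaces every occurrence, with no regex escaping to worry about
    writeFileSync(outputPath, text.split(search).join(replacement));
    ```

    Run it with something like `npx tsx replace.ts input.txt output.txt foo bar`.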

    When you have a few small applications under your belt you can start to dream big. As in, start solving “real” problems. Like some automation that would help you or someone you know. Or tasks at work for a software company. Or that cool app you’ve always wanted to build. Working on real applications will give you more confidence and open the door to more learning. You’ll run into lots of problems and learn how not to do things. So many ways not to do things.

    TLDR: If it’s not clicking, you need to, as a general rule, do less learning (in the conventional sense of absorbing and integrating information) and more doing. A lot of doing.




  • This works as a general guideline, but sometimes you aren’t able to write the code in a way that truly self-documents.
    If you come back to a function after a month and need half an hour to understand it, you should probably add some comments explaining what was done and why it was done that way (in addition to considering if you should perhaps rewrite it entirely).
    If your code is going to be used by third parties, you almost always need more documentation than the raw code.
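
    For instance (a made-up snippet; the backstory in the comment is invented to show the idea), a comment can record the why that the code alone can’t:

    ```typescript
    function retryDelayMs(attempt: number): number {
      // Cap at 30s: the (hypothetical) upstream gateway drops connections
      // idle for longer than that, so bigger waits look like dead clients.
      // The formula says what happens; only the comment says why.
      return Math.min(30_000, 500 * 2 ** attempt);
    }
    ```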

    Yes, documentation can become obsolete. So constrain its use to cases where it actually adds clarity, and commit to keeping it up to date as the code evolves.



  • wols@lemm.ee to Programmer Humor@lemmy.ml · In case you forgot. · 1 year ago

    Extra steps that guarantee you don’t accidentally treat an integer as if it were a string or an array and get a runtime exception.
    With generics, the compiler can prove that the thing you’re passing to that function is actually something the function can use.

    Really, what you’re doing, if you’re honest, is the compiler’s work: hmm, inside this function I access this field on this parameter. Can I pass an argument of such and such a type here? Lemme check if it has that field. Forgot to check? Or were mistaken? Runtime error! If you’re lucky, you caught it before production.
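
    A small sketch of what it looks like when the compiler does that work for you (TypeScript as an arbitrary example; the names are made up):

    ```typescript
    // The function accesses .length on its parameter; the generic constraint
    // makes the compiler verify that every argument actually has that field.
    function describeLength<T extends { length: number }>(value: T): string {
      return `has length ${value.length}`;
    }

    describeLength("hello");    // OK: strings have .length
    describeLength([1, 2, 3]);  // OK: arrays have .length

    // describeLength(42);
    // ^ rejected at compile time: number has no 'length' field.
    //   Without the check, this is a runtime error waiting for production.
    ```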

    Not to mention that types communicate intent. It’s no fun trying to figure out how to use a library that has bad/missing documentation. But it’s a hell of a lot easier if you don’t need to guess what type of arguments its functions can handle.


  • wols@lemm.ee to Programmer Humor@lemmy.ml · My poor RAM... · 1 year ago

    The point is that you’re not fixing the problem, you’re just masking it (and one could even argue enabling it).

    The same way adding another 4-lane highway doesn’t fix traffic long term (increasing highway throughput leads to more people, which leads to more cars, which leads to congestion all over again), simply adding more RAM is only a temporary solution.

    Developers use the excuse of people having access to more RAM as justification to produce more and more bloated software. In 5 years you’ll likely struggle even with 32GiB, because everything uses more.
    That’s not sustainable, and it’s not necessary.



  • Yup.

    Spaces? Tabs? Don’t care, works regardless.
    Copied some code from somewhere else? No problem, 9/10 times it just works. Bonus: a smart IDE will let you quick-format the entire code to whatever style you configured at the click of a button even if it was a complete mess to begin with, as long as all the curly braces are correct.
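
    A toy example: this is whitespace mangled by a careless copy-paste, yet it still compiles, and one keystroke reformats it:

    ```typescript
    function sum(xs: number[]) { let total = 0;
          for (const x of xs) {
      total += x }
            return total }

    console.log(sum([1, 2, 3])); // 6: ugly layout, same correct program
    ```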

    Also, in any decent IDE you will very rarely need to actually count curly braces, it finds the pair for you, and even lets you easily navigate between them.

    When your language cares so much about layout, the inconsistent way whitespace is handled across applications makes interacting with code outside your own files incredibly finicky.

    There’s an argument to be made for the simplicity of python-style indentation and for its aesthetic merits, but IMO that’s outweighed by the practical inconvenience it brings.


    You don’t need to correct something everyone already knows is an exaggeration (and I agree it doesn’t seem very socially aware to do so), but this is a political discussion on the internet, so

    1. Not everyone knows the original figure is an exaggeration, and especially not by how much
    2. Providing the actual information adds value to the conversation, and in this context that is more important than whether the commenter comes off as smarmy or socially inept

    What if they said “Hey I know you’re being hyperbolic, but for anyone who’s interested, here’s the number estimated by experts…”?
    The only difference here is tone.
     

    I’m not sure why they only shared numbers for minke whales, as these don’t seem to be hunted in Iceland anymore, in contrast to the fin whales the article was about.

    The global fin whale population was estimated by the IUCN in 2018 at around 100,000.
    https://www.iucnredlist.org/species/2478/50349982#population




    That does indeed seem like the hangup in this case, and it’s on me; I should have used a less vague word or else clarified what I meant.

    To me fresh is anything that hasn’t been processed for preservation (except drying). So cheese isn’t fresh, heat treated milk/cream isn’t fresh, smoked and cooked meats aren’t fresh, pickled foods aren’t fresh, frozen foods aren’t fresh and anything with actual preservatives added is definitely not fresh.
    “Raw” would probably have been the better word to use.
    Also, having thought about my own understanding of the word a bit more in depth, I’ll concede that some pickled veggies are pretty healthy, as well as yoghurt.

    You were right with all three examples.



  • Actually fruits are pretty great for us, if they aren’t highly processed.
    Better to eat an apple than drink apple juice, also better to eat an apple than just about anything from the supermarket that isn’t fresh.
    Of course, you still need a balanced diet, and you can’t get nearly all the necessary nutrients from just apples. Still, assuming an otherwise nutrient-complete diet, it’s a lot less healthy to eat a slice of frozen pizza than an apple or a banana. (the apple might even contain less available sugar than the pizza slice - people often overestimate how much sugar fruits really contain)

    The “stuff removed” bit is more important than you seem to give it credit for. Take out all the fiber and water and sure it’s still the same sugars that are left over, but we didn’t evolve to consume large quantities of pure sugar, so it spikes our insulin and gets stored as excess fat.

    Fruit juice is pretty unhealthy, because all the sugar is more available due to all the fiber being stripped out and you can consume a dozen apples’ worth in a few minutes, which you wouldn’t do with actual apples.

    Sure, there’s not that much fiber left in raisins either. But in the context of muesli they can be combined with whole grains and nuts, so you get enough fiber back to make the sugar slower to digest and thus healthier.

    A third of the entire cereal mix being sugar is definitely worse than muesli with raisins (which comes to about 10g of sugar per 100g), especially considering that a good portion of the remaining mass in muesli is fiber, protein, and healthy fats.

    Adding sugar isn’t just “another big issue”, it’s the big issue. Eating fresh fruits is a non-issue, and usually so is eating dried fruits in moderation.


  • Many of the programming languages that are regularly the butt of everyone’s jokes don’t just allow you to use them badly, they make it easy to do so, sometimes easier than using them well.
    This is not a good thing. A good language should

    • be well suited to the task at hand
    • be easy to use correctly
    • be hard to use incorrectly

    The reality is that the average software developer barely knows best practices, much less how to apply them effectively.
    This fact, combined with languages that make it easy to shoot yourself in the foot, leads to lots of bad code in the wild.

    Tangentially related rant

    We should attack this problem from both directions: improve developers but also improve languages.
    Sometimes that means replacing them with new languages built on years of knowledge that we didn’t have when the old ones were designed.

    There seems to be a certain cynicism (especially from some more senior developers) about new languages.
    I’ve heard stuff like: every other day a new programming language is invented, it’s all just a fad, they add nothing new, all the existing languages could already do all the things the new ones can, etc.
    To me this misses the point. New languages have the advantage of years of knowledge accrued in the industry along with general technological advancements, allowing them to be safer, more ergonomic, and more efficient.
    Sure, we can also improve existing languages (and should, and do), but oftentimes, for one reason or another (backwards compatibility, implementation effort, the wider technological ecosystem, dogma, politics, etc.), old quirks and deficiencies stay.

    Even for experienced developers who know how to use their language of choice well, there can be unnecessary cognitive burden caused by poor language design. The more your language helps you automatically avoid mistakes, the more you can focus on actually developing software.

    We should embrace new languages when they lead to more good code and less bad code.