• 1 Post
  • 336 Comments
Joined 1 year ago
Cake day: September 24th, 2023

  • Totally depends on what you end up working on as a programmer. If it’s web apps, you’ll be totally fine. All you need is basic arithmetic. Writing a game engine? You’ll need to know some basic to moderate matrix maths…

    If you’re doing formal verification using unbounded model checking… good fucking luck.

    On average I would say most programming tasks need very little maths. If you can add and multiply you’ll be fine. Definitely sounds like you’ll be ok.
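
    As a very rough illustration of the kind of matrix maths a game engine leans on (a made-up sketch of mine, not from any real engine): rotating a 2D point with a rotation matrix.

    import math

    def rotate_2d(point, angle_radians):
        """Rotate a 2D point about the origin using a 2x2 rotation matrix."""
        x, y = point
        c, s = math.cos(angle_radians), math.sin(angle_radians)
        # Matrix-vector multiply: [c -s; s c] applied to [x, y]
        return (c * x - s * y, s * x + c * y)

    # Rotate (1, 0) by 90 degrees -> approximately (0, 1)
    print(rotate_2d((1, 0), math.pi / 2))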









  • The is operator is for identity, not equality. Your example is just using it weirdly in a way that most people wouldn’t do.

    The + operator is for numbers or strings, not arrays. Your example is just using it weirdly in a way that most people wouldn’t do.

    I’m not defending Javascript’s obviously terrible behaviour there. Just pointing out that Python has obviously terrible behaviours too. In both cases the solution is “don’t do that, and use static analysis to make sure you don’t do it accidentally”.
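
    To make the identity-vs-equality distinction concrete, a quick example of my own (not from the code being discussed):

    a = [1, 2, 3]
    b = [1, 2, 3]
    c = a

    print(a == b)  # True:  same contents, so equality says yes
    print(a is b)  # False: two distinct list objects, so identity says no
    print(a is c)  # True:  c names the very same object as a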

    Sometimes I meet junior developers who have only ever used javascript, and it’s (to borrow another contentious nerd topic) like meeting someone who’s only ever played D&D talking about game design.

    Yeah I think you can generalise that to “have only ever used one language”. I would say Python and Javascript are pretty close on the “noob level”. By which I mean if you meet someone who has only ever written C++, Java, or Rust or whatever they’re going to be a class above someone who has only ever written Python or Javascript.


  • Why would you use the is operator like that?

    Why would you add two arrays like that?

    Do you not use containers when you deploy?

    No because I am not using Python to make a web app. That’s not the only thing people write you know…

    JavaScript is so bad you’ve resorted to using a whole other language: Typescript

    Well yeah. Typescript isn’t really a new language. It’s just type annotations for JavaScript (except for enums; long story). But yes JavaScript is pretty bad without Typescript.

    But Typescript isn’t a cop-out like Docker is.

    But the language it’s built on top of is extremely warty. Maybe we agree on that.

    Yeah definitely. You need to ban the warts but Typescript & ESLint do a pretty good job of that.

    I mean I would still much rather write Dart or Rust but if I had to pick between Typescript and Python there’s absolutely no way I’d pick Python (unless it was for AI).





  • I dunno if you’re being deliberately obtuse, but just in case you really did miss his point: the fact that type hints are optional (and not especially popular) means many libraries don’t have them. It’s much more painful to use a library without type hints because you lose all of their many benefits.

    This obviously isn’t a problem in languages that require static types (Go, Rust, Java, etc…) and it isn’t a problem with Typescript because static types are far more popular in JavaScript/Typescript land so it’s fairly rare to run into a library that doesn’t have them.

    And yeah you can just not use the library at all but that’s just ignoring the problem.
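
    A tiny sketch of what you lose (the function and the “library” are made up purely for illustration):

    def total_price(prices: list[float], tax_rate: float) -> float:
        """A hypothetical library function that ships type hints."""
        return sum(prices) * (1 + tax_rate)

    # With hints, Pyright/mypy catch this before it ever runs
    # (a str is not a float):
    # total_price([10.0, 20.0], "0.2")

    def total_price_untyped(prices, tax_rate):
        """The same function from a hypothetical library with no hints."""
        return sum(prices) * (1 + tax_rate)

    # With no hints the checker has nothing to say, and you get no useful
    # autocomplete either; the bug only surfaces at runtime as a TypeError
    # deep inside the library.
    # total_price_untyped([10.0, 20.0], "0.2")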


  • A sane language, you say.

    Yes:

    Operator '+' cannot be applied to types 'number[]' and 'number[]'.
    

    We’re talking about Typescript here. Also I did say that it has some big warts, but you can mostly avoid them with ESLint (and Typescript of course).

    Let’s not pretend Python doesn’t have similar warts:

    >>> x = -5
    >>> y = -5
    >>> x is y
    True
    >>> x = -6
    >>> y = -6
    >>> x is y
    False
    >>> x = -6; y = -6; x is y
    True
    
    >>> isinstance(False, int)
    True
    
    >>> [f() for f in [lambda: i for i in range(10)]]
    [9, 9, 9, 9, 9, 9, 9, 9, 9, 9]
    

    There’s a whole very long list here. Don’t get me wrong, Python does a decent job of not being crazy. But so does Typescript+ESLint.

    I’ve worked professionally in python for several years and I don’t think it’s ever caused a serious problem. Everything’s in docker so you don’t even use venv.

    “It’s so bad I have resorted to using Docker whenever I use Python.”




  • Typescript is far nicer than Python though. Well I will give Python one point: arbitrary precision integers was absolutely the right decision. Dealing with u64s in Typescript is a right pain.

    But apart from that it’s difficult to see a single point on which Python is clearly better than Typescript:

    • Static typing. Pyright is great but it’s entirely optional and rarely used. Typescript obviously wins here.
    • Tooling. Deno is fantastic but even if we regress to Node/NPM it’s still a million miles better than the absolute dog shit pile of vomit that is Pip & venv. Sorry Python but admit your flaws. uv is a shining beacon of light here but I have little hope that the upstream Python devs will recognise that they need to immediately ditch pip in favour of officially endorsing uv. No. They’ll keep it on the sidelines until the uv devs run out of hope and money and give up.
    • Performance. Well I don’t need to say more.
    • Language sanity. They’re pretty on par here I think - both so-so. JavaScript has big warts (the whole prototype system was clearly a dumb idea) but you can easily avoid them, especially with ESLint. But Python has equally bad warts that Pylint will tell you about, e.g. having to tediously specify the encoding for every file access (see the sketch after this list).
    • Libraries & ecosystem. Again I would say there’s no much in it. You’d obviously be insane to use Python for anything web related (unless it’s for Django which is admittedly decent). On the other hand Python clearly dominates in AI, at least if you don’t care about actually deploying anything.

  • They seem exactly the same to me: when a variable is assigned a value, it’s equal to that value now.

    Yeah it’s confusing because in maths they are the same thing and use the same symbol, but in programming they are 100% not the same, yet most languages confusingly reused that symbol. In fact they even used the mathematical equality symbol (=) for the thing that is least like equality (i.e. assignment).

    To be fair not all languages made that mistake. There are a fair few where assignment is like

    x := 20
    

    Or

    x <- 20
    

    which is probably the most logical option because it really conveys the “store 20 in x” meaning.

    Anyway on to your actual question… They definitely aren’t the same in programming. Probably the simplest way to think of it is that assignment is a command: make these things equal! and equality is a question: are these things equal?

    So for example equality will never mutate its arguments. x == y will never change x or y because you’re just asking “are they equal?”. The value of that equality expression is a bool (true or false) so you can do something like:

    a = (x == y)
    

    x == y asks if they are equal and becomes a bool with the answer, and then the = stores that answer inside a.

    In contrast = always mutates something. You can do this:

    a = 3
    a = 4
    print(a)
    

    And it will print 4. If you do this:

    a = 3
    a == 4
    print(a)
    

    It will (if the language doesn’t complain at you for this mistake) print 3 because the == doesn’t actually change a.