
Which programming language did you choose and why?

Arbiter

Scholar
Joined
Apr 22, 2020
Messages
2,522
Location
Poland
cmake is cancer and should be killed with fire. We use meson + ninja, those are quite nice. I'm exactly the same: if git clone plus make, or maybe pasting a few lines from the README into the terminal, isn't enough, I have to be *extremely* interested in the project to even bother.

I recommend Bazel for C++ and polyglot projects. Reproducible builds, safe incrementality and caching, great performance, distributed builds if you need them.
 

ds

Cipher
Patron
Joined
Jul 17, 2013
Messages
1,376
Location
here
Isn't one of the main benefits of C++ (other than it being sane compared to average C code) that things like iterators are FAR more efficient than the C equivalents?

I'm far from a C++ expert, but my understanding is that it allows a lot of optimisation in a generic fashion that just isn't possible with C (or at least has to be rolled manually, it's not part of the library).

C++ gets a lot of (well deserved) hate because there's so much cruft associated with the language, especially stuff from old versions that has to remain "compatible", but arguing the language is slower than C is just plain silly. It's generally the same speed and is provably faster in a significant number of use cases.
Yes, one example is qsort in C, which always needs an indirect function call for each comparison, vs. std::sort in C++, which can inline the comparator. C also doesn't have standard containers (and even libraries have cumbersome or unsafe interfaces due to not having templates), so a lot of C code rolls its own ad-hoc containers using sub-par algorithms. Not that C++'s standard containers are stellar, but they are at least OK for most programs. In the end, C++ is (almost) a superset of C, so it can always be as fast as C. Idiomatic C++ can be slower, though, since the language and standard library do have subpar parts.
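
To make the qsort vs. std::sort point concrete, here's a minimal sketch (illustrative only, not from any particular codebase):

```cpp
#include <algorithm>
#include <cstdlib>
#include <vector>

// C: qsort calls the comparator through a function pointer for every
// comparison, which the compiler generally cannot inline.
static int cmp_int(const void* a, const void* b) {
    int x = *static_cast<const int*>(a);
    int y = *static_cast<const int*>(b);
    return (x > y) - (x < y);
}

void sort_c(std::vector<int>& v) {
    std::qsort(v.data(), v.size(), sizeof(int), cmp_int);
}

// C++: std::sort is a template instantiated with the comparator's type, so
// the comparison can be inlined and optimized together with the sort loop.
void sort_cpp(std::vector<int>& v) {
    std::sort(v.begin(), v.end(), [](int a, int b) { return a < b; });
}
```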

I know people whine about C++ and go "but muh vtable", but they are engaging in the same CPU-cycle counting that I hate. If they really hate vtables that much they can just not use inheritance since IIRC C++ doesn't generate vtables at all if you don't have any virtual functions.
Having an option to put the vtable directly into the object would actually make sense for non-OO uses of virtual dispatch in C++ where you only have one or a few instances of each interface at any time. Unlike prematurely hand-optimizing each line of code, this is something that would just need to be implemented once in the compiler to save a needless indirection (one extra memory load and a wasted cache line).
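
On the "no virtual functions, no vtable" point, a quick sketch (exact sizes are implementation-defined, but mainstream 64-bit compilers behave like this):

```cpp
#include <cstdio>

struct Plain {          // no virtual functions: no vtable, no hidden pointer
    int x;
};

struct WithVirtual {    // at least one virtual function: each object carries
    int x;              // a hidden pointer to the class's vtable
    virtual void f() {}
};

int main() {
    // Typically prints "4 16" on a 64-bit platform: the vtable pointer
    // (plus alignment padding) accounts for the difference.
    std::printf("%zu %zu\n", sizeof(Plain), sizeof(WithVirtual));
}
```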

In general, while it's true that you should focus on optimizing the hotspots first, just needlessly throwing away performance all over the place is exactly what leads to today's software often running slower than software on much slower hardware decades ago. Even if the loss is tiny each time, it does add up.

Ninja isn't that much better. Neither is Ant or Jenkins or any of that junk. It's all symptomatic of the same problem, complicated and badly designed codebases.

Ninja is far better at hiding that complexity and simplifying it than cmake is (cmake somehow manages to make complex build situations even MORE complicated), but it's still dealing with a lot of complexity, especially if something goes wrong.
What the hell are you talking about, CMake and Ninja are not alternatives but complements. CMake / meson / automake are all Makefile generators. Ninja is a make alternative that cuts all the fat that is not needed when you use a Makefile generator anyway and so don't need any logic in the build part. CMake can generate definitions for Ninja instead of Make just fine.

Instead, you should focus on making your codebase so simple and manageable that someone can compile it in a pinch without a build tool (even if it gives them a suboptimal result that takes a long time, like g++ *.cpp -o out).

I would say even make is pushing it. The only reason I recommend it is that generating .o files is pretty much the minimum requirement for efficient compilation of any sufficiently large C or C++ project, and it does that brilliantly, and not much else, making it the perfect build tool.
People don't keep reinventing complex Makefile generators without reason. Using just make works if you only want to build on your own system, but once you need a somewhat portable build system you have to deal with things being in different locations and other shit. pkg-config helps but isn't enough and doesn't even exist on all platforms. If your project is complex enough you probably also want optional dependencies, and if you want your program to be packaged by Linux distributions / *BSD port trees / Homebrew you also need to provide enough config switches for your code to be installed in the correct locations. Autotools, CMake and meson all tackle that and all have their own shittiness - not the least of which is CMake's inbred scripting language - but they also all kinda manage to do what they set out to do. The best one to use is the one that your downstream users will already be familiar with so they don't need to learn a new set of idiosyncrasies.

I recommend Bazel for C++ and polyglot projects. Reproducible builds, safe incrementality and caching, great performance, distributed builds if you need them.
Adding a JRE dependency just for the build system is pretty meh.
 

Egosphere

Arcane
Joined
Jan 25, 2018
Messages
1,909
Location
Hibernia
Then task them with writing a simple program that creates a vector<int>, adds a few numbers, removes one, and loops over the rest, printing them with whatever function they like.
I'd bet they'll get as far as defining it, but fail to figure out how to do almost anything with it.
That's how programming works. You figure out how to do something with some language and then you do it. STL has a logic and it's actually quite easy to understand if you spend some time learning it. What I find funny is that people are never happy about anything, there is always something you could do to "improve" a language. We even have these improved languages like Rust, Go, Python etc. already, so it's difficult to understand why people don't use them. I mean some do, but you don't hear them talking about how much easier it is to do something, because it's not about the language, it's the game/program/whatever you are trying to create.
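
For reference, the whole exercise above is only a handful of lines of standard C++ (one possible version, sketched here with a range-based for):

```cpp
#include <iostream>
#include <vector>

int main() {
    std::vector<int> v;
    v.push_back(3);
    v.push_back(1);
    v.push_back(4);
    v.push_back(1);

    v.erase(v.begin() + 1);   // remove the second element

    for (int n : v)           // loop over what's left and print it
        std::cout << n << '\n';
}
```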

The STL takes forever to compile and is very complicated under the hood, which makes debugging difficult, and it was not designed for games, so it doesn't have the reliability and speed that really good programmers can get. Also, C++ templates are the cause of this meme. If you don't use templates and use a unity build, a program of any size will compile in less than 10 seconds, instead of hours, or even days in some cases.

[image: compiling.png]


"We even have these improved languages like Rust, Go, Python etc."

I don't use Rust, but from the comments I have seen from game programmers who gave it a good try, it is not productive to use, and they don't need its memory safety features, because they aren't junior programmers. Python is a high level language that is good for prototyping. You can get a lot done fast thanks to its libraries, but the programs run very slow, which is bad for the user experience. I tried learning Go and liked its style, then I remembered it uses garbage collection and stopped learning it. Garbage collection is not useful for games, because players will notice your program stuttering when the garbage collector randomly decides to release unused memory.

IMO all people ever really needed was C. They just needed to make better libraries for it, and make it easier to compile, so it is more accessible to new users. Hopefully an AI will be created that can sort out the mess.

Edit:
If anyone wants to learn the simple way of C programming, watch Casey Muratori's first 20 or 30 youtube videos to get the main ideas: unity build, using bat/shell files to build programs, and his style of programming. I think he calls it compression oriented programming. Basically, he doesn't design everything upfront, he lets the program design itself. So he solves problems one at a time, and if he finds himself solving the same problem several times, only then does he make a general solution. Also, dig around the Handmade Hero forums, there's some good info there.

This is also very helpful.
https://www.gingerbill.org/series/memory-allocation-strategies/
Does anyone know the Casey Muratori videos mentioned? There's a series of vids on youtube, but it's only a 5 part series with an additional Q&A vid on top, so 10 vids in total.
 
Joined
Dec 17, 2013
Messages
5,183
"Python is a high level language that is good for prototyping."

Yes!

"You can get a lot done fast thanks to its libraries, but the programs run very slow, which is bad for the user experience."

This is wrong. Python is not slow, it is slower (than, say, C++, Java, C#, etc) - in the same way that a Honda is slower than a Lamborghini. But that doesn't make the Honda a worse car, since in the vast majority of real life situations you will be driving under 70mph anyway. In those scenarios, the Honda's price, reliability, and ease of servicing make it a far preferable option to a Lamborghini.

Same with Python. While it is significantly slower in absolute terms than many other languages due to it being interpreted, dynamically typed, etc, in real life this has very limited impact. Modern hardware is extremely powerful, and "slow" in programming language terms means a tiny observable difference in many situations. Obviously speed is not a factor at all in areas like automation, automated testing, back-end processes, etc.

But even in areas with people using software, Python's lack of speed will not be noticeable enough to matter in many cases. As proof, just check out how many websites with a MASSIVE volume of visits use or have used Python in the past, websites like Instagram, Reddit, Pinterest, Uber, DropBox, Google, YouTube, etc. The general pattern with these is to use Python at first, and then as the website starts getting billions of visits a day, they do generally switch to something faster, but up until that point, we are still talking about Python easily handling massive traffic. Or look at Godot, the game engine. They developed their own language, which is basically Python but with support for multi-threading. So technically you can even develop games in Python effectively with a few slight tweaks to the language. Of course in general, games are one of the few areas where I wouldn't necessarily choose Python, but I am just using this as an example for people who think it's so deadly slow.

What novice programmers often don't understand is that the speed of the language is likely the least important factor in the speed of the application. Far more important factors are things like bad code, network bandwidth, etc.

If you ever visit a really slow website, or use a really slow app, in almost every case this is because the code in that website/app is doing really dumb shit, like nested loops through large data sets, or cascading through the database and collecting all related data without the need for it, or some other stupid stuff like this. Or loading massive javascript and resource files for every webpage on a slow connection.
 
Joined
Jan 5, 2021
Messages
414
People don't keep reinventing complex Makefile generators without reason. Using just make works if you only want to build on your own system, but once you need a somewhat portable build system you have to deal with things being in different locations and other shit. pkg-config helps but isn't enough and doesn't even exist on all platforms. If your project is complex enough you probably also want optional dependencies, and if you want your program to be packaged by Linux distributions / *BSD port trees / Homebrew you also need to provide enough config switches for your code to be installed in the correct locations. Autotools, CMake and meson all tackle that and all have their own shittiness - not the least of which is CMake's inbred scripting language - but they also all kinda manage to do what they set out to do. The best one to use is the one that your downstream users will already be familiar with so they don't need to learn a new set of idiosyncrasies.

I've yet to see a project of any size where the build can't be configured with the right combination of environment variables and command line switches.

Worst case, you can write a simple shell script to handle most build cases.

My problem isn't that these systems are useless, it's that they do a relatively simple job in the most complicated, bloated, and difficult to use way possible. All of them cause no end of problems when they just simply don't work for whatever reason, and the shittiness gets in the way of what is essentially a simple process.

Every single time I have replaced one of these with something custom, the custom thing has always been simpler and easier for everyone involved.

If you need a build system that complicated, you're not coding correctly. It's that simple.

Also, unrelated, the problem with Python isn't its speed. It's the fact that it's weakly typed. All weakly typed languages are inherently flawed to the point of being unusable for real projects, because there's a 100% guarantee that you will make a mistake that's hard if not impossible to debug when some assignment or expression causes an unwanted conversion. Even worse, when the new conversion is mostly valid, this bad conversion can corrupt all the data in your codebase and cause major issues elsewhere that will look like unrelated errors. Enough C programmers have weird issues with integer vs float divides, it's even worse in weakly typed languages. Python also has really awful syntax which makes it especially unusable.
 
Joined
Dec 17, 2013
Messages
5,183

Also, unrelated, the problem with Python isn't its speed. It's the fact that it's weakly typed. All weakly typed languages are inherently flawed to the point of being unusable for real projects,

This is nonsense. I already gave a ton of examples of the GREATEST real projects all using Python (Instagram, Reddit, Uber, YouTube, Google, Pinterest), but yeah, I am sure Python can't accommodate your 10 user cat-phishing app.

The whole dynamic typing thing is overblown. It IS an issue in the sense that it makes debugging/maintenance more complicated, but real world is all about trade-offs. No language is perfect, and while this is a relative flaw of Python and languages like it, to counter-balance it, Python brings a TON of advantages to development.

As far as dealing with it, one, you can use annotations to type your variables in Python, which is almost as good as real strong typing, if you absolutely need it. Two, it's only a major issue if you work with mediocre/nooblet programmers. Quality programmers write well structured code with descriptive variable names, where it's not that fucking hard to figure out the variable type. It will take a little bit more time than in a strongly typed language, but you will save a lot more time from other positive features of Python.

because there's a 100% guarantee that you will make a mistake that's hard if not impossible to debug when some assignment or expression causes an unwanted conversion. Even worse, when the new conversion is mostly valid, this bad conversion can corrupt all the data in your codebase and cause major issues elsewhere that will look like unrelated errors. Enough C programmers have weird issues with integer vs float divides, it's even worse in weakly typed languages. Python also has really awful syntax which makes it especially unusable.

This proves my point above. If you have shitty programmers, yes, this can become a problem. But then in this case you will have a lot of problems anyway. Otherwise, it's really not that difficult to work with the dynamic freedom Python gives you.
 

Rincewind

Magister
Patron
Joined
Feb 8, 2020
Messages
2,472
Location
down under
Codex+ Now Streaming!

Also, unrelated, the problem with Python isn't its speed. It's the fact that it's weakly typed. All weakly typed languages are inherently flawed to the point of being unusable for real projects,

This is nonsense. I already gave a ton of examples of the GREATEST real projects all using Python (Instagram, Reddit, Uber, YouTube, Google, Pinterest), but yeah, I am sure Python can't accommodate your 10 user cat-phishing app.

The whole dynamic typing thing is overblown. It IS an issue in the sense that it makes debugging/maintenance more complicated, but real world is all about trade-offs. No language is perfect, and while this is a relative flaw of Python and languages like it, to counter-balance it, Python brings a TON of advantages to development.

As far as dealing with it, one, you can use annotations to type your variables in Python, which is almost as good as real strong typing, if you absolutely need it. Two, it's only a major issue if you work with mediocre/nooblet programmers. Quality programmers write well structured code with descriptive variable names, where it's not that fucking hard to figure out the variable type. It will take a little bit more time than in a strongly typed language, but you will save a lot more time from other positive features of Python.

because there's a 100% guarantee that you will make a mistake that's hard if not impossible to debug when some assignment or expression causes an unwanted conversion. Even worse, when the new conversion is mostly valid, this bad conversion can corrupt all the data in your codebase and cause major issues elsewhere that will look like unrelated errors. Enough C programmers have weird issues with integer vs float divides, it's even worse in weakly typed languages. Python also has really awful syntax which makes it especially unusable.

This proves my point above. If you have shitty programmers, yes, this can become a problem. But then in this case you will have a lot of problems anyway. Otherwise, it's really not that difficult to work with the dynamic freedom Python gives you.
If you see beyond the superlatives and the trolling in the above post, Porky is actually touching on something important that I've been contemplating for a while.

I'm very fond of strict typing because that makes it possible to perform large-scale refactors in a secure manner... Or so I thought. But then, even in a typed language, just because your stuff still compiles after the refactorings, there are actually zero guarantees that it still does the correct thing. So you must have sufficient automated test coverage, or at the very least you must test manually to make sure your code is not completely broken, even though it compiles just fine. Which brings the *drawbacks* of strict typing into question; namely that often you need to do a lot of busywork just to satisfy the compiler, and at the end of the day, you still need to test your stuff, you can't just go "it compiles -- ship it! woohooo!". Compiler errors and warnings only prevent a small subset of possible problems from happening, but in no way can a compiler guarantee *correctness*. No language can do that, I don't even think Ada can 100% of the time (but I have no actual working experience with it).

So yeah, I'm not going to throw all my typed languages away at this very moment, but I've been questioning for a while now whether the safety the compiler gives you is actually a false sense of safety that can be actually harmful (if you don't do sufficient levels of testing), and you do need to "shoehorn" your thoughts into the confines of what is expressible in a given statically typed language, which subtly (or not so subtly) guides your thinking down a certain path. You can only think what you have concepts for, and in a way statically typed languages restrict the set of "potentially thinkable" concepts to maybe a "most useful subset of everything that's thinkable", but it's still a restriction imposed by the *designer* of the compiler on your thinking! Think about that!

A good example of this is homogeneous lists. Yes, often you can design your app around lists having the exact same base type, but often that introduces varying amounts of OOP cruft. Yes, you can get around it by putting root-level objects in your list (Any, Object, void *, or whatever it's called in any given statically typed language), but then you need to do a lot of casting, and that effectively negates the benefits of using a typed language in the first place plus adds a lot of noise. If your problem domain somehow consists of very fluidly defined data that defies strict categorisation, I'd say using a typed language where you must use Any/Object/very generic maps/hashes/etc. to represent such data is just a bad choice.
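
To illustrate the C++ side of that trade-off, here's a small sketch of the "list of whatever" case using std::any (C++17); every access needs an explicit cast back to the real type, which is exactly the noise I mean:

```cpp
#include <any>
#include <iostream>
#include <string>
#include <typeinfo>
#include <vector>

int main() {
    // A heterogeneous list: the static type system is satisfied, but the
    // actual element types are only known again at runtime.
    std::vector<std::any> items;
    items.push_back(42);
    items.push_back(std::string("hello"));
    items.push_back(3.14);

    for (const auto& item : items) {
        // Each element must be explicitly cast back before it can be used.
        if (item.type() == typeid(int))
            std::cout << std::any_cast<int>(item) << '\n';
        else if (item.type() == typeid(std::string))
            std::cout << std::any_cast<std::string>(item) << '\n';
        else if (item.type() == typeid(double))
            std::cout << std::any_cast<double>(item) << '\n';
    }
}
```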

Interestingly, this is one of the main reasons why Rich Hickey designed Clojure to have a dynamic type system. The types of problems he tends to work on just defy the strict categorisation enforced by strict type systems, in which case strict typing becomes a hindrance rather than a help. I've also read accounts of LISP devs who claimed refactoring was never a problem in hugely large-scale LISP programs back in the day, despite the ultimate freedom of LISP and dynamic typing (think of 1000+ man projects with millions of lines of code, not your average small toy project...)

So like I said, I'm not exactly a convert yet, but I'd be interested in doing some real-world non-toy projects in dynamic languages so I can compare the actual dev/maintenance experience myself.

You know, the difference between theory and practice is that in theory there is no difference...
 

Rincewind

Magister
Patron
Joined
Feb 8, 2020
Messages
2,472
Location
down under
Codex+ Now Streaming!
Also, I hate `int` promotion in C/C++ with a passion. That inconsistent, arbitrary, compiler/platform dependent misfeature should be killed with fire. But it won't be; it's legacy cruft that maybe made some sense on old architectures where all you had was bytes and 16-bit words, or maybe 32-bit dwords, and you just can't break compatibility with those metric tonnes of legacy code out there that run the whole world...
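
The classic promotion surprise, as a sketch (it assumes int is wider than 8 bits, which holds on every platform you'll realistically target):

```cpp
#include <cstdint>
#include <cstdio>

int main() {
    std::uint8_t a = 0xFF;
    std::uint8_t b = ~a;   // ~a promotes a to int: ~0x000000FF == -256;
                           // the assignment then truncates that back to 0

    // Both sides of == are promoted to int again, so this compares
    // 0 with -256 and prints "different", even though b was just set to ~a.
    if (b == ~a)
        std::puts("same");
    else
        std::puts("different");
}
```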

Also... real-world cross-platform C/C++ programming, 2023 edition

[image: LhpSPOX.png]
 

ds

Cipher
Patron
Joined
Jul 17, 2013
Messages
1,376
Location
here
I don't think "static typing doesn't catch literally every bug" is that great of an argument against static typing. How much you catch also depends on how specific your types are - e.g. do you have a generic time type, or separate types for different time domains (e.g. real time and simulation time) without implicit conversions? I also don't really find the constraints that much of a problem in practice - how often do you really need a list of "whatever"?
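
A minimal sketch of what I mean by separate time-domain types (names are made up for illustration; std::chrono durations with distinct tag types would be the more idiomatic route):

```cpp
#include <iostream>

// Two thin wrapper types with no implicit conversion between them, so
// mixing up the time domains is a compile error instead of a silent bug.
struct RealSeconds { double value; explicit RealSeconds(double v) : value(v) {} };
struct SimSeconds  { double value; explicit SimSeconds(double v)  : value(v) {} };

void advance_simulation(SimSeconds dt) {
    std::cout << "advancing simulation by " << dt.value << "s\n";
}

int main() {
    RealSeconds frame_time{0.016};
    // advance_simulation(frame_time);   // would not compile: wrong time domain
    advance_simulation(SimSeconds{frame_time.value * 0.5});  // conversion made explicit
}
```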

Also... real-world cross-platform C/C++ programming, 2023 edition

[image: LhpSPOX.png]
This doesn't really need to bloat the expressions with nonsense like "#ifdef defined" or random parentheses to make the point. Also should only apply to a relatively small subset of your code unless you are writing a platform abstraction library.
 
Joined
Jan 5, 2021
Messages
414
Quality programmers write well structured code with descriptive variable names, where it's not that fucking hard to figure out the variable type. It will take a little bit more time than in a strongly typed language

Way to miss the point entirely. The problem is not that I can't figure out the types at declaration time; the problem is that variables can change type at runtime in mysterious, difficult to debug, and problematic ways, and are very error-prone as a result.

but you will save a lot more time from other positive features of Python.

As you say, business is all about tradeoffs. Develop in python and you will get rapid development. You'll also be stuck with python. Want to have speed at scale? Too bad, you're stuck with python. Need to interact with memory directly? Too bad, you're stuck with python. Need to write testable code that works reliably with any input? Too bad, you're stuck with python (or any other weakly typed language).

Python has its place. It's excellent for build scripts and little tools for doing things like automating keystrokes in games. But leave the real programming to the grown ups.

This proves my point above. If you have shitty programmers, yes, this can become a problem. But then in this case you will have a lot of problems anyway. Otherwise, it's really not that difficult to work with the dynamic freedom Python gives you.

Have you never in your entire life written code containing a mistake?

All it takes is a one second lapse of concentration and you have an impossible to debug error which the compiler won't warn you about, because it's a "feature".

I'm very fond of strict typing because that makes it possible to perform large-scale refactors in a secure manner... Or so I thought. But then, even in a typed language, just because your stuff still compiles after the refactorings, there are actually zero guarantees that it still does the correct thing. So you must have sufficient automated test coverage, or at the very least you must test manually to make sure your code is not completely broken, even though it compiles just fine. Which brings the *drawbacks* of strict typing into question; namely that often you need to do a lot of busywork just to satisfy the compiler, and at the end of the day, you still need to test your stuff, you can't just go "it compiles -- ship it! woohooo!". Compiler errors and warnings only prevent a small subset of possible problems from happening, but in no way can a compiler guarantee *correctness*. No language can do that, I don't even think Ada can 100% of the time (but I have no actual working experience with it).

The point of strong typing is not to guarantee logical correctness, it's to guarantee consistency. If I have a function taking 2 int values and it adds them together and returns the result, I know that I can pass in any 2 values and will always be guaranteed that the function works the same way, will not throw a runtime error, and will generally behave consistently based on the inputs. Obviously there are caveats - we have to account for nulls and invalid states in many functions (although things like the null object pattern can help with this), sometimes we have to catch possible exceptions, etc, but generally functions can have consistent results, especially in languages like C++ that can provide a no-throw guarantee.
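
A tiny sketch of what that guarantee looks like (a made-up sum, mirroring the Sum function described here):

```cpp
int sum(int a, int b) {
    return a + b;   // always int in, int out; it can never silently become a concat
}

int main() {
    int ok = sum(2, 3);
    // int bad = sum("2", "3");   // does not compile: the mismatch is caught
                                  // before the program ever runs
    return ok == 5 ? 0 : 1;
}
```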

With a weakly typed language, I have no guarantees. If I pass in 2 strings (or a string and an int) it will concat them, if I pass in 2 objects it will do whatever their + operator tells them to do, or throw a runtime error if the type has no + operator defined.

Weak typing gives me no control over what is passed to the function and no guarantee that what is passed will be of a valid type for the comparison.

With strong typing, I can guarantee at COMPILE TIME that I am, at the very least, calling this function in a way that will generate a valid answer. I cannot guarantee that the answers I get back will be utilised correctly, or that I will pass in the correct values, but as a function, I can guarantee that it works. With tests, this can become even better because I am free to refactor without fear of change to the requirements of the resulting function.

I have no such guarantee with weak typing. At any time someone can throw something at my function that may generate a runtime error because it's an invalid type (which means everything needs to be wrapped in exception handling), or they could throw a valid type but not the intended type and turn my Sum function into a Concat function. Worse, we now have undefined behaviour - if bob's code was fetching numbers from a table and passing them to my Sum function, everything will work correctly, until some idiot adds "three" to the table - now when bob calls the function, we have undefined results (likely a concatenation), no error, and invalid data. All because we wanted to use a "nice and easy" weakly-typed language. Data can inform the functionality of my code - that is INSANE! How can I guarantee ANYTHING when my fundamental program logic can be modified by the data passed in by external code!

You are correct in that the compiler won't save me if I write bad code. But it's critically important that the compiler is able to detect these sorts of typing errors so that they don't become runtime errors. Compiler errors will be found by me and fixed before I ship (obviously), but runtime errors are going to be found by my users. As a developer you should ALWAYS prefer compile errors over runtime errors. Doing all that "busywork to appease the compiler" isn't just some unnecessary and annoying boilerplate, it's fundamentally protecting your code from bad inputs and ensuring you're not doing something monumentally stupid.

This is especially important when using inheritance/polymorphism. Polymorphic code is extremely safe in strongly-typed OO languages because of how much the compiler takes care of. I have to know ahead of time which functions I can call and which variables I can query/set, but that's a benefit. In non-polymorphic strongly-typed languages (like C) and weakly typed languages (like all the garbage the zoomers use), polymorphism is an absolute crapshoot, since the compiler has absolutely no idea if a function call is valid or not and can give you no help, especially since weakly typed languages don't really have a concept of polymorphism (and how could they - base types don't exist because types don't exist), so usually you have to resort to passing objects around, calling functions by name, and hoping everything works. And you can't test it because just like with my Sum function there's no way to guarantee someone isn't passing something invalid into the function to cause a runtime error (usually some sort of invalid method error). The worst you can do in a strongly typed language is pass a null reference (and in languages like C#, this is being slowly fixed thanks to nullable support)
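
To sketch the polymorphism point in a strongly typed language (a made-up Shape example, purely illustrative): anything stored behind the base pointer is guaranteed at compile time to implement the interface, and calling something that isn't part of it simply won't compile.

```cpp
#include <iostream>
#include <memory>
#include <vector>

struct Shape {
    virtual ~Shape() = default;
    virtual double area() const = 0;
};

struct Circle : Shape {
    double r;
    explicit Circle(double r) : r(r) {}
    double area() const override { return 3.14159265 * r * r; }
};

struct Square : Shape {
    double side;
    explicit Square(double s) : side(s) {}
    double area() const override { return side * side; }
};

int main() {
    std::vector<std::unique_ptr<Shape>> shapes;
    shapes.push_back(std::make_unique<Circle>(1.0));
    shapes.push_back(std::make_unique<Square>(2.0));

    for (const auto& s : shapes) {
        std::cout << s->area() << '\n';
        // s->perimeter();   // compile error: not part of the Shape interface
    }
}
```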

Of course this all goes out the window if you throw. My Sum function can return an int, but if it can also throw a RuntimeException well, there goes my type safety. This is why, when using strongly-typed languages, I highly recommend not throwing in any of your core business code, only throw for wacky shenanigans (like database failures), and handle them nearby where they are raised.

Some weakly typed languages (eg PHP) allow you to use "type hints" for things like function arguments, but these barely work, provide no protection, and are a bandaid on top of the problem. PHP has no valid way to handle polymorphism other than "run it and hope for the best", and neither does python from what I recall.

Which language runs on 3 billion devices? Checkmate, atheists.

Python virgins DESTROYED by Java chads

Also, I hate `int` promotion in C/C++ with a passion. That inconsistent, arbitrary, compiler/platform dependent misfeature should be killed with fire. But it won't be; it's legacy cruft that maybe made some sense on old architectures where all you had was bytes and 16-bit words, or maybe 32-bit dwords, and you just can't break compatibility with those metric tonnes of legacy code out there that run the whole world...

Yes. The other misfeature of C/C++ that I hate is how "dynamic" the / operator is. Oh, you were expecting a proper divide? Joke's on you, have an integer divide. Enjoy your divide by 0 error, bitch!
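
The trap in two lines, as a sketch:

```cpp
#include <cstdio>

int main() {
    int hits = 1, total = 2;

    double wrong = hits / total;                       // integer divide happens first: 0.0
    double right = static_cast<double>(hits) / total;  // floating-point divide: 0.5

    std::printf("%f %f\n", wrong, right);
    // And unlike floating point, integer division by zero is undefined
    // behaviour rather than producing infinity or NaN.
}
```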
 

Rincewind

Magister
Patron
Joined
Feb 8, 2020
Messages
2,472
Location
down under
Codex+ Now Streaming!
The point of strong typing is not to guarantee logical correctness, it's to guarantee consistency.
I know what strong typing is. I'm coming from an assembly background; I learned and used x86 asm years before I even learned C :)

Technically what you write is true, but I think you missed the points I raised about the potential benefits of dynamic typing. Like I said, I lack experience with large-scale projects written in dynamic languages, but I'd be interested in giving them a go with an open mind and experiencing their particular tradeoffs in a real-world scenario. I've been using statically typed languages professionally for 20+ years and I'm increasingly annoyed by their limitations, so yeah, I'm curious.

In the real world, every language is about tradeoffs; it's not as if static typing is the ultimate solution to everything or something.

So, I'm kinda sitting on the fence. Which has the benefit of seeing both camps from a distance.
 
Joined
Dec 17, 2013
Messages
5,183
Quality programmers write well structured code with descriptive variable names, where it's not that fucking hard to figure out the variable type. It will take a little bit more time than in a strongly typed language

Way to miss the point entirely. The problem is not that I can't figure out the types at declaration time, the problem is the variables will change at runtime in mysterious, difficult to debug, and problematic ways, and are very error-prone as a result.

This only happens if you write bad code. When you walk in real life, you have the freedom to jump out of windows, run across highways, walk into lampposts. But most people don't. They just walk one foot in front of the other.

Just because Python gives you the freedom to have dynamic variables doesn't mean you have to always use it. It's meant for very specific cases where variable types change in terms of business logic, but in most cases, you can program Python as if it was a strongly typed language, by sticking to the single logical variable type, which will prevent all these bugs you are talking about.

If you insist on doing dumb stuff, you are just as likely to do it in a strongly typed language.

but you will save a lot more time from other positive features of Python.

As you say, business is all about tradeoffs. Develop in python and you will get rapid development. You'll also be stuck with python. Want to have speed at scale? Too bad, you're stuck with python. Need to interact with memory directly? Too bad, you're stuck with python. Need to write testable code that works reliably with any input? Too bad, you're stuck with python (or any other weakly typed language).

Nonsense again. You are not stuck with Python any more than with any other language; in fact Python has all sorts of tools like Cython where you can code high-performance parts of the program in C and it will integrate well with the rest of the Python program.

Python has its place. It's excellent for build scripts and little tools for doing things like automating keystrokes in games. But leave the real programming to the grown ups.

Yeah ok, Instagram, YouTube, Google, Uber, Reddit, and the entire AI industry have called and asked to laugh at you.


This proves my point above. If you have shitty programmers, yes, this can become a problem. But then in this case you will have a lot of problems anyway. Otherwise, it's really not that difficult to work with the dynamic freedom Python gives you.

Have you never in your entire life written code containing a mistake?

All it takes is a one second lapse of concentration and you have an impossible to debug error which the compiler won't warn you about, because it's a "feature".

Read Rincewind 's post above. Everybody makes mistakes while programming, but good programmers don't make dumb mistakes, like passing around changing variable types in spaghetti code like candy. Well structured Python code is not that different from any other language in terms of potential bugs. If you insist on doing stupid things, you will find ways to do them in any paradigm.

Also, only shitty programmers rely on debuggers/compilers to catch their errors.

 
Joined
Jan 5, 2021
Messages
414
This only happens if you write bad code. When you walk in real life, you have the freedom to jump out of windows, run across highways, walk into lampposts. But most people don't. They just walk one foot in front of the other.

Just because Python gives you the freedom to have dynamic variables doesn't mean you have to always use it. It's meant for very specific cases where variable types change in terms of business logic, but in most cases, you can program Python as if it was a strongly typed language, by sticking to the single logical variable type, which will prevent all these bugs you are talking about.

If you insist on doing dumb stuff, you are just as likely to do it in a strongly typed language.

I don't think you know what "strongly typed" means

Nonsense again. You are not stuck with Python any more than with any other language; in fact Python has all sorts of tools like Cython where you can code high-performance parts of the program in C and it will integrate well with the rest of the Python program.

Saying you're not stuck with Python because C bindings exist is like saying you're not stuck with SQL because it's compatible with multiple database types.

If your business logic is written in python, whether you use python or cython, you're stuck with python. Sure, you can coax some extra speed out of it, but all the other problems remain.

Yeah ok, Instagram, YouTube, Google, Uber, Reddit, and the entire AI industry have called and asked to laugh at you.

Every single one of these sites is an overbloated, barely functional mess. I know you're talking about back end, but if their front-end is this bad I don't expect their back end to be much better.

When entire websites (like Parler) can be taken offline because their back ends were so intertwined with AWS garbage, maybe the "web industry" is not a good example to use in favour of your language - literally no web developer knows what they are doing.

I hesitate to call web developers - front or back end - programmers.

But sure, you can take the win here. Python is definitely a popular language, because it's easy to use. That doesn't equate to good, though. Javascript is extremely popular (and weakly typed!), and I don't think you'll find a single person stupid enough to say it's a good language.

Read Rincewind 's post above. Everybody makes mistakes while programming, but good programmers don't make dumb mistakes, like passing around changing variable types in spaghetti code like candy. Well structured Python code is not that different from any other language in terms of potential bugs. If you insist on doing stupid things, you will find ways to do them in any paradigm.

I already gave an example above of how your program can be well-written, but some third party data can corrupt your variables by changing their type.

You're technically correct in that perfectly-written code is not error prone. But we are all fallible, and when it comes down to it, you're always better off with a language that makes it as difficult as possible to make mistakes - especially mistakes involving data corruption. Weakly typed languages make it EXTREMELY EASY for even minor mistakes to have dire consequences.

Stop basing your argument around "well good code doesn't have issues!". Nobody is disputing that you can write code in weakly typed languages which mitigates these issues. In the same vein, it's possible to navigate a minefield without a detector and survive. It's also really stupid.


Also, only shitty programmers rely on debuggers/compilers to catch their errors.

This is just so fundamentally, insanely wrong I don't even know where to start. Any sane, competent programmer will do everything in their power to ensure their code doesn't ship with bugs. This includes (but is not limited to) using a language that won't fuck them at the first opportunity.
 
Joined
Dec 17, 2013
Messages
5,183
This only happens if you write bad code. When you walk in real life, you have the freedom to jump out of windows, run across highways, walk into lampposts. But most people don't. They just walk one foot in front of the other.

Just because Python gives you the freedom to have dynamic variables doesn't mean you have to always use it. It's meant for very specific cases where variable types change in terms of business logic, but in most cases, you can program Python as if it was a strongly typed language, by sticking to the single logical variable type, which will prevent all these bugs you are talking about.

If you insist on doing dumb stuff, you are just as likely to do it in a strongly typed language.

I don't think you know what "strongly typed" means

You are using it as too much of a crutch. Good programmers can make a weakly typed language behave similarly to a strongly typed one, while bad ones will just make mistakes in a different way.

Nonsense again. You are not stuck with Python any more than with any other language; in fact Python has all sorts of tools like Cython where you can code high-performance parts of the program in C and it will integrate well with the rest of the Python program.

Saying you're not stuck with Python because C bindings exist is like saying you're not stuck with SQL because it's compatible with multiple database types.

If your business logic is written in python, whether you use python or cython, you're stuck with python. Sure, you can coax some extra speed out of it, but all the other problems remain.

I don't even understand what your argument here is, tbh. If you use a language, you are stuck with it. Ok...

Yeah ok, Instagram, YouTube, Google, Uber, Reddit, and the entire AI industry have called and asked to laugh at you.

Every single one of these sites is an overbloated, barely functional mess. I know you're talking about back end, but if their front-end is this bad I don't expect their back end to be much better.

When entire websites (like Parler) can be taken offline because their back ends were so intertwined with AWS garbage, maybe the "web industry" is not a good example to use in favour of your language - literally no web developer knows what they are doing.

I hesitate to call web developers - front or back end - programmers.

But sure, you can take the win here. Python is definitely a popular language, because it's easy to use. That doesn't equate to good, though. Javascript is extremely popular (and weakly typed!), and I don't think you'll find a single person stupid enough to say it's a good language.

Well, while people are literally changing the world with Python and Javascript (web, AI, robotics, automation, scientific programs), you keep thinking that whatever language you like is great while you create really fast code for your 10 user company. You do you.

Read Rincewind 's post above. Everybody makes mistakes while programming, but good programmers don't make dumb mistakes, like passing around changing variable types in spaghetti code like candy. Well structured Python code is not that different from any other language in terms of potential bugs. If you insist on doing stupid things, you will find ways to do them in any paradigm.

I already gave an example above of how your program can be well-written, but some third party data can corrupt your variables by changing their type.

How often would it happen that a third party company just randomly changes their API data types, doesn't inform anyone, and the new type doesn't break something in your code as an alert to you? That's such a fringe scenario, I am sure something similar can be found for any other paradigm. In most RL cases, stuff like this is not likely to happen, and if you are dealing with such shitty third party companies, you can easily write some data validation.

You're technically correct in that perfectly-written code is not error prone. But we are all fallible, and when it comes down to it, you're always better off with a language that makes it as difficult as possible to make mistakes - especially mistakes involving data corruption. Weakly typed languages make it EXTREMELY EASY for even minor mistakes to have dire consequences.

Stop basing your argument around "well good code doesn't have issues!". Nobody is disputing that you can write code in weakly typed languages which mitigates these issues. In the same vein, it's possible to navigate a minefield without a detector and survive. It's also really stupid.

That's the thing though, you are only looking at the negative side of Python without looking at the massive positives. It's actually all about trade-offs.

Also, only shitty programmers rely on debuggers/compilers to catch their errors.

This is just so fundamentally, insanely wrong I don't even know where to start. Any sane, competent programmer will do everything in their power to ensure their code doesn't ship with bugs. This includes (but is not limited to) using a language that won't fuck them at the first opportunity.

You are misunderstanding what I said. Good programmers don't rely on debuggers to catch their errors because they test their shit.
 
