Ruling JavaScript and Python programmers out would be more sane imho. Java sucks, but at least it's typed and doesn't implement weird semantics.
Had to work with a Python programmer on a small Java project (in uni). I passed some (handcrafted) strings in an Optional to be explicit, and the first thing he does is check whether they are empty (passing on empty strings would not have been problematic). Also, he had compilation errors on his branch that lasted over a week. What Python does to someone.
I worked under a self-proclaimed Python/JavaScript programmer, and part of the job involved doing rather advanced stuff in various other typed languages like C# and C++. It was hell. The code reviews were hell. For every tiny little thing, we had to go through why “coding C++ like it is Python” is a very bad idea.
What is crazy about developers who exclusively work with scripting languages is that they have no conception of why general good practices exist, and they will often make up their own rules based on their own quirks. In my previous example, the developer in question was the author of a codebase that was in literal development hell, but he was adamant about not changing his ways. I’d definitely be wary of hiring someone who has exclusively worked with scripting languages, and sometimes it is less work to train someone who is a blank slate than to try to deprogram years of bad habits.
Are you referring to Python and JS as scripting languages? The two most popular languages on the planet? Ones which are capable of building almost any kind of app imaginable? Surely you don’t apply your limited experience with a single dev to a group of millions of developers doing extremely varied things, right?
Python and JS are by definition scripting languages in the classical sense. I am not using the term in a derogatory way, and I myself learnt programming this way as a 90s kid. No offense, but I think you took my comment way too personally.
What is the “classical” sense? What are you implying when you say they are “scripting” languages? What you are imparting to me is that they are lesser than other, “real” languages. I don’t take personal offense, but I do take issue with the mischaracterization and implication that those languages are somehow less serious or less broadly useful.
No hard feelings! (:
https://en.m.wikipedia.org/wiki/Scripting_language
A scripting language, or interpreted language, is interpreted at runtime, rather than compiled.
It is not derogatory, and is simply a fact about languages like Python and JS.
If someone on the internet calls something a “scripting language,” it’s hard to take that in a vacuum. I’ll accept that there is overlap between “interpreted” and “scripting” languages, but they aren’t synonymous, particularly in my experience interacting with developers online. The typical discourse does indeed trivialize the so-called scripting languages, and my only intent is to say that they are a lot more than what they began as.
There are definitely people out there shitting on all sorts of languages, and JS is a huge target, but those have been referred to as scripting languages for as long as they have existed. It stems from the fact that those languages are embedded into existing applications, as opposed to being built into binaries. Nowadays you have hybrids like C# which can be used either as a scripting language or to build a native app (or in between), so it is really just a matter of the context you’re using the language in. There is inherently no hidden meaning or elitism in the term. It is a very old term, and I think you simply got the wrong impression from your internet experiences. It is how those languages are defined basically everywhere. Some of those languages even self-define as scripting languages on their own official websites. There is no ambiguity here at all.
Most scripting languages are interpreted, not compiled. It’s not a criticism of them, but it is a tradeoff that is good to understand.
It seems like you are the one who is conflating terms like “script kiddie” with “scripting language” and adding some negative connotation that isn’t necessarily implied.
Scripting languages are usually easier to learn, have simpler syntax, and abstractions that hide complexity. These make them easier to get started in, but the downside is they are generally slower (performance-wise) than their compiled counterparts.
Kinda sounds like they’re adamant about not changing their ways in response to things not working as they expect.
I’d change this slightly - the problem isn’t exclusively working in scripting languages, but dynamically typed ones. There are people who write great code in Python (with typing) and in TypeScript, and they usually can work well in other languages too. But people who don’t type their programs are, in my experience, simply bad developers, the way you describe.
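To make the “with typing” part concrete, here is a minimal sketch of what annotations buy you in Python, assuming a checker like mypy is run over the code (the function and the bad call are made up for illustration):

    def normalize_email(email: str) -> str:
        """Lower-case and strip an email address."""
        return email.strip().lower()

    print(normalize_email("  Alice@Example.COM "))  # alice@example.com

    # Without the annotation, a call like normalize_email(None) only blows up
    # at runtime (AttributeError deep inside the function). With it, a checker
    # such as mypy reports the incompatible argument type before the code runs.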
True that, this was pretty much the intended meaning of my reply but you worded it better.
Ah, good!
I feel like there is a fundamental difference between developers with a data-centric perspective, and a function-centric perspective.
The function-centric one is about adding functionality, and it’s what developers start out with. You have functions that do things, and if requirements change or the thing should be re-used - no problem, I can quickly add a new toggle parameter here or bolt it on over there. I’ll be done in 5 minutes, no problem!
Then, over time, you learn that functionality isn’t that interesting or difficult. Instead, the hard parts are the ones concerning the flow of data through your application. What do I know about the shape of my data in this part of my application? What can I be sure of regarding invariants over there? This forces you to build modular software without interdependencies, because - in the end - you just build a library that has small adapters to the outside world.
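A rough sketch of what I mean, in Python just for illustration (the domain and all the names are made up): the shape and invariants live in the data type, the core logic is a plain function over it, and the outside world only touches a thin adapter.

    from dataclasses import dataclass

    # The shape of the data and its invariants live in one place.
    @dataclass(frozen=True)
    class Order:
        items: tuple[str, ...]
        total_cents: int

        def __post_init__(self) -> None:
            if self.total_cents < 0:
                raise ValueError("total_cents must be non-negative")

    # Core logic is a plain function over that data: no I/O, easy to test.
    def apply_discount(order: Order, percent: int) -> Order:
        return Order(order.items, order.total_cents * (100 - percent) // 100)

    # Small adapter to the outside world: translate untyped input at the edge.
    def order_from_request(payload: dict) -> Order:
        return Order(tuple(payload["items"]), int(payload["total_cents"]))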
I like scripting languages a lot, but it’s way too easy to become “good” at that style of programming, and the better you get at it, the harder it will be to actually move forward to a data-centric perspective. It’s a local maximum that can trap people, sometimes for their whole career. That’s why I try to look at typing experience when evaluating candidates for positions.
I wanted to get back to you, because you are so very right, and I have spent the last 10 years or so trying to evangelize the fact that implementing algorithms and logic isn’t the hard part; it is a trivial concern, really. Everything that goes wrong with development usually involves the flow of data, and figuring out how to get this data from over here to over there without making a big mess. To do that, you absolutely need to write small modules with few dependencies. You gotta think about the life cycle of your objects, and generally follow all the principles of SOLID if you’re doing OOP. Personally, I really love using dependency injection when the project allows for it.
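For what it’s worth, here is a minimal constructor-injection sketch in Python, no framework involved, just to show the idea (all the names are hypothetical):

    import time
    from typing import Protocol

    # The module depends on interfaces, not on concrete implementations.
    class Clock(Protocol):
        def now(self) -> float: ...

    class Repository(Protocol):
        def save(self, record: dict) -> None: ...

    # Dependencies come in through the constructor, so the life cycle
    # (who builds what, and when) is decided in one place, at the edge.
    class AuditLog:
        def __init__(self, clock: Clock, repo: Repository) -> None:
            self._clock = clock
            self._repo = repo

        def record(self, event: str) -> None:
            self._repo.save({"event": event, "at": self._clock.now()})

    # Composition root: the only code that knows about concrete types.
    class SystemClock:
        def now(self) -> float:
            return time.time()

    class InMemoryRepository:
        def __init__(self) -> None:
            self.records: list[dict] = []

        def save(self, record: dict) -> None:
            self.records.append(record)

    log = AuditLog(clock=SystemClock(), repo=InMemoryRepository())
    log.record("user_logged_in")

In a test you can hand AuditLog a fake clock and an in-memory repository without touching the class itself, which is most of the point.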
It is as you said, really: you can have thousands of hours of programming experience, but if you have never tried to solve those issues you’re really limiting yourself. Some devs think designing software around your data instead of your algorithms is overthinking it, or “overengineering” as I have been told. Well, I would not hire those people for sure.
I have seen clean projects made up of small modules, with clear boundaries between data, functions, and the lifecycle configuration. It is night and day compared to most code bases. It is really striking just how much of the hidden, and not-so-hidden, complexity and goo and hacks and big-ass functions in most code bases really just exists because application life cycle management is often non-existent. In a “proper” code base, you shouldn’t have to wonder how to fetch a dependency, or whether an object is initialized and valid, or where to instantiate your module, or even what constructor to invoke to build a new object. This takes care of so much useless code it is insane.
To close on this, I like scripting languages a lot as well, and you can do great things with some of them even if a lot of developers don’t. JS has TypeScript, ReactiveX, dependency injection frameworks, etc. It is a great language with a lot of possibilities, and you’re not forced into OOP, which I think is great (OOP and functional programming are orthogonal solutions imo). But the reality is that the language is really easy to misuse, and you can definitely pick up bad habits from it. Same as you, I would be wary of a developer with no experience with strongly typed languages, or at the very least TS. I am very happy to hear this take randomly on the internet, because in my experience this is not how most developers operate, and imo it is demonstrably wrong not to design applications around your data.
You put it very well!
I freaking love you and I’ll try to write a worthy reply when I am home.
<3
That’s true.
It’s also true in other fields. For example, take East Asian martial arts:
Young students will try to hit someone, to beat someone up, to hit a target, to become “stronger”.
Experienced teachers, however, don’t really care about hitting a target. It’s all about the posture. How you stand. How you carry out your movements.
That guy was shitty at Python, then. Python is all about EAFP instead of LBYL.
Eat ass fast paced instead of lay back your lettuce?
“Easier to Ask Forgiveness than Permission” vs. “Look Before You Leap.”
In other words, in Python you should just write the code to do the thing and then put an exception handler at the bottom instead of cluttering up your function with guard code everywhere.
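A contrived sketch of the two styles, assuming a config dict whose “port” entry may be missing or malformed (the names are made up):

    # LBYL ("Look Before You Leap"): guard everything up front.
    def get_port_lbyl(config: dict) -> int:
        if "port" in config and str(config["port"]).isdigit():
            return int(config["port"])
        return 8080  # fallback default

    # EAFP ("Easier to Ask Forgiveness than Permission"): just do the thing
    # and handle the failure cases in one place at the bottom.
    def get_port_eafp(config: dict) -> int:
        try:
            return int(config["port"])
        except (KeyError, TypeError, ValueError):
            return 8080  # fallback default

    print(get_port_lbyl({"port": "443"}))  # 443
    print(get_port_eafp({}))               # 8080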
I wouldn’t call this a “python thing”.
I grew up with C, and C/C++ is still my main language; checking for empty strings is instinctive to me. It’s cheap insurance, and what does it cost, a couple of cycles?
Though you won’t find me using bare C strings these days unless there is a damn good reason for it. So much extra work to handle them. Even in embedded work, string classes have superseded them.
deleted by creator
You called?