> Programmers program computers. We call them programmers because they program.
You've yet to give a definition of "program" I cannot satisfy with HTML, CSS and Javascript. The notion of "non-trivial" is not rigorous.
> What other definition of "programmer" could you possibly be using?
This is what I am wondering, because by any rational examination of what you've presented, you've refuted your own claim. HTML+CSS is Turing complete, which in practice means I can "program" arbitrary logic in it. It's a dialect of programming heavily biased towards displaying content, but it clearly can do things.
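To make that concrete, here's a toy sketch of boolean logic in pure HTML+CSS, no JavaScript at all: two checkboxes act as inputs, and the output text only turns green when both are checked, i.e. an AND gate built entirely out of selectors. (The usual full Turing-completeness argument, Rule 110 in CSS, does need a user clicking to advance steps, but the point stands: the logic lives in the markup and the stylesheet.)

```html
<!-- Toy sketch: an AND gate in pure HTML+CSS, no JavaScript.
     #a, #b and #out must be siblings in this order. -->
<style>
  #out { color: red; }
  /* Fires only when both preceding sibling checkboxes are :checked */
  #a:checked ~ #b:checked ~ #out { color: green; }
</style>
<input type="checkbox" id="a"> A
<input type="checkbox" id="b"> B
<span id="out">A AND B</span>
```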
Clearly explain to me: why is making a modern webpage any different from writing a simple modern Ruby script? Or a trivial iPhone app, for that matter? Open up a nib file and you'll see a serialized object graph, heavily biased towards the presentation and its structure. Callbacks are annotated into the system. But you could make a "dumb" iPhone app without ever writing a single line of code, using drag and drop components and GUI clicks. Oh my, that seems even less like programming, but in many cases it's indistinguishable from programming.
You really need to step back and ask yourself, "What is the actual difference here?" Your primary justification so far has been about people, but your original claim was about what "programming" is. If a "programmer" is someone who programs computers and we cannot cleanly and clearly explain why webdev is not programming, then what?
And to be honest, it's sort of patronizing and discouraging to new users. HTML+CSS+JS is the most widely deployed, used, programmed, customized, and maintained programming environment in human history. Most people will find their way to the discipline via web browsers now, just as most kids in my childhood did via BASIC (which drew the same kind of aspersions you're casting here).
Methods change and evolve. Environments change and evolve. Most of all, requirements change and evolve.