Can we please list major developments in Programming Language Theory Since 2000?

Not_Oles (Hosting Provider, Content Writer)
edited December 2022 in Help

Yesterday I bounced through a whole bunch of web searches about Programming Language Theory ("PLT"). I grabbed several textbook pdfs. Three especially interesting things I discovered from yesterday's searches were:

  • The Wizard Book has a new 2022 edition which uses JavaScript instead of Scheme!

  • Could it be true that there is no new (post-2000) university course textbook on PLT (as opposed to the new edition of a classic mentioned above)? (But what about John Guttag's Introduction to Computation and Programming Using Python, Second Edition, MIT Press, 2016?)

  • The Wikipedia article on Programming Language Theory ends its historical discussion with the 1990s!

Can you please help list major developments in PLT since 2000? What should the list contain?

Are there any post-2000 university course textbooks on PLT?

Thanks in advance! :)

I hope everyone gets the servers they want!

Comments

  • Nekki (OG)
    edited December 2022

    You're very demanding.

    Is this for an LET article?

    Thanked by (1)_MS_
  • Yep, probably offloading paid work onto the community. I observe Oles a lot, both on OGF and here. He's degrading slowly, undeniably. I should write a philosophical treatise about him. There is a void, deep inside.

    Thanked by (1)Nekki
  • PLT is a totally different subject from PL software. The main post-2000 development I know of is homotopy type theory (HoTT). What is that? I don't know either, but there is a book about it at https://github.com/HoTT/book . All the other pointy-headed stuff like dependent types, linear types, etc. was around pre-2000, even if it wasn't deployed in software.
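
    If you want a rough taste of what the dependent-types flavor looks like, here is a minimal sketch in Haskell (using GADTs and DataKinds); the names Nat, Vec, and safeHead are made up purely for illustration. The idea is that a list's length becomes part of its type, so taking the head of an empty list is rejected at compile time:

        {-# LANGUAGE DataKinds, GADTs, KindSignatures #-}

        -- Type-level natural numbers.
        data Nat = Z | S Nat

        -- A vector whose length n is tracked in its type.
        data Vec (n :: Nat) a where
          VNil  :: Vec 'Z a
          VCons :: a -> Vec n a -> Vec ('S n) a

        -- Only callable on provably non-empty vectors, so there is
        -- no runtime "empty list" error case to handle.
        safeHead :: Vec ('S n) a -> a
        safeHead (VCons x _) = x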

    A general intro to the topic of PLT is here: https://www.cs.cmu.edu/~rwh/pfpl/

    That is a nerdy book, but PLT is a nerdy subject.

    Thanked by (1)Not_Oles
  • @Not_Oles said: The Dragon Book has a new 2022 edition which uses Javascript instead of Scheme!

    Aren't those completely different books? "Structure and Interpretation of Computer Programs, JavaScript Edition" and "Compilers: Principles, Techniques, and Tools" (Dragon Book)

    Thanked by (1)Not_Oles
  • Not_Oles (Hosting Provider, Content Writer)

    @cmeerw said:

    @Not_Oles said: The Dragon Book has a new 2022 edition which uses Javascript instead of Scheme!

    Aren't those completely different books? "Structure and Interpretation of Computer Programs, JavaScript Edition" and "Compilers: Principles, Techniques, and Tools" (Dragon Book)

    My bad! Fixed. Thanks! :)

    Reference: the SICP book "is known as the 'Wizard Book' in hacker culture. . . ."

    I hope everyone gets the servers they want!

  • @Nekki said:
    Is this for an LET article?

    No, it's for an article on London Ex Boyfriends.

    Thanked by (1)Nekki

    ServerFactory aff best VPS; HostBrr aff best storage.

  • Mason (Administrator, OG)

    @Nekki said:
    You're very demanding.

    Is this for an LET article?

    .

    @legendary said:
    Yep, probably offloading paid work onto the community. I observe Oles a lot, both on OGF and here. He's degrading slowly, undeniably. I should write a philosophical treatise about him. There is a void, deep inside.

    .

    @yoursunny said:

    @Nekki said:
    Is this for an LET article?

    No, it's for an article on London Ex Boyfriends.

    Can we cool it with the character attacks, guys? Not really needed. He's not writing articles for LEB. Just a guy trying to learn new things and pick some brains.

    Head Janitor @ LES

  • @Mason said:

    @Nekki said:
    You're very demanding.

    Is this for an LET article?

    Can we cool it with the character attacks, guys? Not really needed. He's not writing articles for LEB. Just a guy trying to learn new things and pick some brains.

    You call it a character attack, I call it responding to a very strangely worded post with appropriate banter and following up with an oldie but a goodie.

    Why can’t we take the michael out of someone without it being misconstrued as an ‘attack’? I thought I was viewed as a loveable rogue rather than a scallywag with malicious intention.

    This country has gone to the dogs, to the dogs I tell ya!

  • @yoursunny said:

    @Nekki said:
    Is this for an LET article?

    No, it's for an article on London Ex Boyfriends.

    @Mason FWIW this comment has nothing to do with DOT! This is @yoursunny pouring salt in the wound that is our acrimonious break-up in the wake of my affair with @Murv last BF/CM. I don’t see moderators rushing to my defence for such blatant psychological abuse.

    Thanked by (2)yoursunny Murv
  • Mason (Administrator, OG)
    edited December 2022

    @Nekki said:

    @Mason said:

    @Nekki said:
    You're very demanding.

    Is this for an LET article?

    Can we cool it with the character attacks, guys? Not really needed. He's not writing articles for LEB. Just a guy trying to learn new things and pick some brains.

    You call it a character attack, I call it responding to a very strangely worded post with appropriate banter and following up with an oldie but a goodie.

    Why can’t we take the michael out of someone without it being misconstrued as an ‘attack’? I thought I was viewed as a loveable rogue rather than a scallywag with malicious intention.

    This country has gone to the dogs, to the dogs I tell ya!

    Not saying you have any malicious intent. I just think there's a difference between some bants and multiple hecklers piling on and possibly stifling any on-topic conversation.

    I'm all for having fun and shooting the shit with you guys. But the last thing I want for LES is for topics like these to be drowned out by negative comments that detract from whatever legit conversations may have happened. Just compare help/technical threads on LET from 5 years ago to today to really showcase how allowing that to happen introduces a rot to the community that gets out of hand if not addressed.

    I'm also not trying to overmoderate and censor anyone here, which is why I haven't removed/edited any comments and am just trying to gently steer things back on track. Hope you understand, our steadfast, loveable rogue.

    Thanked by (3)bikegremlin pikachu rcy026

    Head Janitor @ LES

  • so @Nekki about that storage server, figure out the weirdness yet?

  • @Not_Oles You trying to learn something about software development? Nowadays, beginner courses use Python or JavaScript instead of C++ to teach theory.

    If you are getting into coding and want actual theory, don't forget the algorithms courses. They will make you a better developer.

  • Not_Oles (Hosting Provider, Content Writer)
    edited December 2022

    @Hxxx said: @Not_Oles You trying to learn something about software development?

    Maybe I am neither trying to become a "software developer" nor trying to learn about "software development." Now that I am retired I have time to try understanding a little about how the computers that I use actually work "under the hood." I decided to look at MIT's online Operating Systems course, which uses the Xv6 teaching OS. Of course, Xv6 is written in C. And C is a programming language.

    When I started skimming the MIT course's Xv6 Book, I got the idea to compare the Xv6 Book with its famous predecessor, the Lions Book. When I first glanced at the Lions Book, I saw chapters on assembly language and on C. The Lions Book Chapter 3 on C sent me to the “C Reference Manual,” by Dennis Ritchie, and to “Programming in C – A Tutorial”, by Brian Kernighan. Maybe everybody who has studied C knows that Ritchie's C Reference Manual later became Appendix A in the K&R book.

    Next I happened to look at Dennis Ritchie's website and at Ritchie's thesis. I imagined that the stylistic differences between the practical "K" part of K&R and the perhaps more theoretical "R" part, Appendix A, might be two different ways to "define" a programming language. I mean, conceivably, a programming language could be "defined" as a set of practical routines, or, alternatively, defined by a "standard," which Appendix A apparently was for a while, followed by the several more official C Standards.

    Next, looking briefly at Ritchie's thesis made me imagine that Ritchie might have had a more "mathematical" definition of C than the Appendix A definition. Before C, and before Bell Labs, Ritchie was studying applied mathematics. So it wouldn't be surprising if Ritchie thought about C mathematically. But it doesn't seem like Ritchie ever wrote an explicitly "mathematical" definition of C. I wonder, however, to what extent the assembly and machine language routines underlying C might qualify as "mathematics."
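
    To make concrete what such a "mathematical" definition might look like, here is a tiny sketch in Haskell of a toy expression language whose meaning is given by a definitional interpreter. The names Expr and eval are purely illustrative, and of course this is nothing Ritchie actually wrote:

        -- A toy expression language: literals, addition, multiplication.
        data Expr = Lit Int | Add Expr Expr | Mul Expr Expr

        -- The "meaning" of the language: a function from syntax to numbers.
        eval :: Expr -> Int
        eval (Lit n)   = n
        eval (Add a b) = eval a + eval b
        eval (Mul a b) = eval a * eval b

        -- Example: eval (Add (Lit 2) (Mul (Lit 3) (Lit 4))) == 14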

    I began looking around to see what a definition of a programming language should look like. I found PLT as a subject area. I found a lot of textbooks that maybe I or someone could read, but I didn't see many textbooks written after the 1990s. Hence this post, asking. . . .

    Thanks for your question, @Hxxx! I will remember your advice about the algorithms courses. :)

    Best wishes and kindest regards! :)

    Thanked by (2)Hxxx bikegremlin

    I hope everyone gets the servers they want!

  • For me, the biggest shift in programming languages since 2000 is the popularity of closures. Before 2000, they were quite a niche concept but they're now an essential part of most modern languages.
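
    For anyone who hasn't run into the term, here is a minimal sketch in Haskell (makeAdder and add5 are just illustrative names): the returned function "closes over" the variable n from the environment where it was defined.

        -- makeAdder returns a function that captures n from its
        -- defining environment: a closure.
        makeAdder :: Int -> (Int -> Int)
        makeAdder n = \x -> x + n

        main :: IO ()
        main = do
          let add5 = makeAdder 5   -- add5 closes over n = 5
          print (add5 10)          -- prints 15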

    Thanked by (2)xleet Not_Oles
  • I wonder if you're trying to learn too many things at once :smile:

    I don't remember if I recommended Crafting Interpreters in our last conversation, but I'd suggest working through it in full first so that you get a handle on the basics of compiler design before picking up advanced concepts.

    Thanked by (3)xleet Not_Oles bikegremlin
  • willie (OG)
    edited December 2022

    @Not_Oles said: Next I happened to look at Dennis Ritchie's website and at Ritchie's thesis...

    I think that would have been written in the 1960s or 70s. Stuff was much different then. Are you trying to learn actual PLT, or just pick up some languages? PLT is an abstract and pointy-headed subject compared to most programming. Languages like C were designed by people who (while they were smart) were completely clueless about what we would now call PLT. The semi-mainstream languages with the most PLT influence in them are Haskell, Rust, Scala, and PureScript (idk about TypeScript). Recent versions of C++ are also getting some PLT input, but C++ is still a huge bloated mess from so many iterations having to keep historical compatibility.
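
    One small example of what that PLT influence looks like in practice, sketched in Haskell (the ages table and describe function are made up for illustration): algebraic data types like Maybe make the "no result" case part of the type, so the compiler forces you to handle it instead of letting a null pointer slip through as C would.

        -- lookup returns Maybe Int: either Just an age or Nothing.
        ages :: [(String, Int)]
        ages = [("ada", 36), ("dennis", 70)]

        describe :: String -> String
        describe name =
          case lookup name ages of
            Nothing  -> name ++ ": unknown"
            Just age -> name ++ ": " ++ show age

        main :: IO ()
        main = mapM_ (putStrLn . describe) ["ada", "grace"]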

    Without for a moment trying to diminish Ritchie's genius, C by today's standards is a terrible language. If you want to write an operating system, maybe consider Rust or even Ada.

    Thanked by (1)Not_Oles
  • @Not_Oles said: Maybe I am neither trying to become a "software developer" nor trying to learn about "software development." Now that I am retired I have time to try understanding a little about how the computers that I use actually work "under the hood." I decided to look at MIT's online Operating Systems course, which uses the Xv6 teaching OS. Of course, Xv6 is written in C. And C is a programming language.

    I would highly recommend the books 'The Little Schemer' and 'The Little Typer'. It feels like you are treated like a five-year-old, as the books are in the form of questions and answers, but they get the basics right.

    PLT hasn't changed much because the underlying hardware hasn't changed much; the transition to analog will likely bring in a lot of new things.

    Thanked by (1)Not_Oles
  • @evnix said:

    I would highly recommend the books 'The Little Schemer' and 'The Little Typer'. It feels like you are treated like a five-year-old, as the books are in the form of questions and answers, but they get the basics right.

    PLT hasn't changed much because the underlying hardware hasn't changed much; the transition to analog will likely bring in a lot of new things.

    I think of analog computers as really old technology that never went anywhere. I played with an analog computer when I was a kid, but it was a useless toy. Can you provide a link to a summary article about this future transition to analog?

    Could you have meant "quantum computer" instead? There is a lot going on in that field.

    Thanked by (1)Not_Oles
  • @xleet said: Could you have meant "quantum computer" instead?

    No. Quantum computers do have a future, but currently they are a big scam at most research institutes, which are just trying to get investor money.

    @xleet said: I think of analog computers as really old technology that never went anywhere.

    Your brain is an analog computer; all neural nets try to simulate an analog machine.
    Neural nets, although amazing, need huge data centers, high-end GPUs, and lots of power to train, whereas your brain is better and can run on a $2 McDonald's burger, or less than 20 watts of power.

    Digital computing is slow and very power-inefficient for anything AI/ML related. It is only useful when you need correctness and precision. But in real life most things don't have to be precise; they only have to be accurate to a degree.

    You can read about the newer neuromorphic chips, or about Intel's Loihi chips (if you are at a university, most computer science departments will have people working on these):

    https://en.wikichip.org/wiki/intel/loihi

    You must have realized by now that most digital computers also mimic analog machines. The reason we did not go full analog is that it was expensive and the hardware was rigid; it is only now that we are able to morph hardware programmatically (like our brains).

    Thanked by (2)bikegremlin xleet