I was thinking a while back, how nice it would be if lua was the scripting language in the browser instead of javascript. There are some projects to compile lua to wasm and have it run in the browser...
https://pluto-lang.org/web/#env=lua%3A5.4.6&code=if%20_PVERS...
But interoperability with the DOM is the missing key.
Still, if lua was used instead of javascript, I could see myself saying... man, I wonder what browser development would be like if we replaced lua with x.
What specifically do you think would be better? Lua shares many of JS's quirks (like the relationship between arrays and non-array objects, the behavior of undefined for non-existent object properties, metatables are somewhat similar to JS prototypes, etc.) and adds a bunch more (lack of continue statement, 1-indexing, cannot have nil values in tables).
I can see people liking or disliking Lua and JS both, depending on taste, but it's hard to see someone liking one and disliking the other.
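To make one of those quirks concrete, here is a two-line sketch of the nil-in-tables point (plain Lua, any recent version):

    local t = { "a", "b", "c" }
    t[2] = nil   -- you can't store nil in a table; assigning nil removes the key
    print(#t)    -- may print 1 or 3: the length of a table with a hole is not defined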
Lua has tail call optimization and js doesn't. For me, this is a dealbreaker for js.
Lua also has operational advantages compared to javascript. You can build it from source in at most a few seconds and run it anywhere that has a c compiler. The startup time is negligible even compared to a compiled c program so you can run entire scripts that do very useful things faster than most js engines can print hello world. There is also the benefit that in spite of what this article discusses, it is possible for most problems to write a lua solution that works for all lua >= 5.1. This is wonderful because it means if I stick to the standard library, the bitrot is essentially zero.
Calling into c is very straightforward in Lua, especially with luajit, which makes it superior to js as a scripting language (the node ffi api is quite painful in my experience and ffi runs against the js execution model).
Lua also essentially got lexical scoping of local variables correct from the beginning while js blessed us with the nightmare of var.
Of course Lua is far from perfect, but while there are some similarities, the differences are quite significant and meaningful.
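To make the tail-call point above concrete, a minimal sketch (plain Lua, nothing version-specific): a call in tail position reuses the current stack frame, so deep recursion like this never overflows.

    local function countdown(n)
      if n == 0 then return "done" end
      return countdown(n - 1)   -- proper tail call: the frame is reused
    end
    print(countdown(10000000))  --> done, with no stack growth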
> Lua also essentially got lexical scoping of local variables correct from the beginning while js blessed us with the nightmare of var.
That was not my experience when I was working with lua. Did anything change since? I asked Google. It answered:
> In Lua, if a variable is assigned a value inside a function without being explicitly declared with the local keyword, it will automatically become a global variable. This is because, by default, all variables in Lua are global unless explicitly specified as local.
Yeah, this puzzled me too. I'm assuming they're referring to the semantics of "var" in JS vs "local" in Lua, with the latter resembling "let" in JS, which doesn't have broken scoping.
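A small illustration of the difference being discussed (plain Lua): assignment without local creates a global, while local is block-scoped, much like let.

    function f()
      counter = 1        -- no `local`: silently creates/overwrites a global
      local count = 1    -- `local`: lexically scoped to f and its inner blocks
      do
        local count = 2  -- shadows the outer `count` only inside this block
      end
      return count       --> 1
    end
    f()
    print(counter)       --> 1   (the accidental global leaked out)
    print(count)         --> nil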
> Calling into c is very straightforward in Lua, especially with luajit, which makes it superior to js as a scripting language (the node ffi api is quite painful in my experience and ffi runs against the js execution model).
As a scripting language for browsers this is an antifeature, and the fact Lua comes by default with a bunch of features that allow loading arbitrary binary code in the process makes it pretty annoying to properly use it as a sandboxed scripting language.
You're conflating a few things here. V8 and SpiderMonkey aren't the only interpreters out there. There are a number of them explicitly designed to be small, easy to compile, and easy to embed, such as MuJS, Fabrice Bellard's QuickJS, and more still. I can't speak to their FFI interfaces, but you can't judge JS's ability to call C functions based on that of Node/V8. I'm not sure how FFI runs against its execution model given JS is generally used as an embedded language, which necessitates foreign function calls.
Why is tail call optimization a dealbreaker for you? That's very specific...
Guaranteed TCO makes it practical to code recursive algorithms in natural style.
IMHO, tail call optimization makes sense only when it's enforced (like in Clojure), otherwise it's wild.
I think a lot of people are missing this part, weighing things like 1-based indexing and curly brackets.
All of that is trivial when you consider the Lua reference implementation. It is beautiful.
If I remember correctly, TCO is now part of the ECMAScript standard. Safari has implemented it. The issue is that other engines have not, because they are concerned about stack unwinding and stack traces.
In theory JavaScript also has it in its standard; unfortunately that is yet another thing that browser vendors don't agree on.
I agree mostly in that Lua and Javascript are both similar, and like I said in my post above, I could see myself saying the exact opposite if Lua had been included in the browser.
The things I do not like about Javascript can easily be shot down in an argument. Some of it was having to work with Javascript (and its evil cousin JScript) in the 90s and early 00s.
The type coercion, and the fact that in the early days people used '=='. I think === did not even appear until IE6?
[] == ![] // true
The lack of a lot of easy helper functions in the standard lib, which are now provided by people writing a bunch of stuff in the npm ecosystem.
The npm ecosystem itself. Lack of security. Lack of... curation? (Also, all this would have probably happened anyway if Lua was in the browser)
I also think javascript's long history has created a legacy of different paradigms:
variable declaration:

    var, let, const

function declaration:

    function f1() {}
    const f2 = function() {};
    const f3 = () => {};
    const obj = { f4() {} };
There is a lot of stuff like this in javascript. I could probably make a blog post about it. But the above gives the general idea of my complaints.
In practice you don't run into these issues often. I'm annoyed when you see different function declaration conventions in the same codebase, but generally () => is used for either one line functions or inline lambdas, and function foo(){} for everything else. Nobody uses var anymore.
The implicit conversions are a definite footgun though.
Funny you mention nobody uses var anymore when I just saw a post on here yesterday that perf-critical code still uses var since it's faster.
Bundlers will convert let/const to var, assign classes and functions to var etc but generally people don't write it themselves unless they want to (ab)use its semantics for performance reasons.
Do people often use bundlers for the backend?
For me, JS has just too much magic, in particular the behavior of 'this', and lots of weird quirks, like "for in" vs. "for of". Lua, on the contrary, is very predictable and therefore easy to understand.
One killer feature of Lua (that surprisingly few scripting languages have) is stackful coroutines, i.e. you can yield across nested stack frames. Unlike JS or Python, there is no artificial split between generators and async/await and no need for function coloring.
If Lua had zero-based indexing, it would be close to perfect :)
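A minimal sketch of the stackful-coroutine point (plain Lua): the yield happens several calls deep inside an ordinary recursive function, yet the caller consumes it as a plain iterator, with no async/generator split.

    local function walk(t, prefix)
      for k, v in pairs(t) do
        if type(v) == "table" then
          walk(v, prefix .. k .. ".")       -- ordinary recursion...
        else
          coroutine.yield(prefix .. k, v)   -- ...yielding from deep inside it
        end
      end
    end

    local data = { a = 1, b = { c = 2, d = { e = 3 } } }
    for path, value in coroutine.wrap(function() walk(data, "") end) do
      print(path, value)
    end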
Have you read large Lua codebases written by others? It is a write-only language. All your "very predictable" features are overloadable, at runtime. No static typing to rest your eye on. It is a swamp.
"But, just write good code" you will say. Just like with Perl, some languages are designed in a way to discourage writing good code...
I'm talking about the core language, which I do find very predictable. You can go crazy with any language. Lua is definitely not worse than Python or JS in this regard.
> No static typing to rest your eye on.
That goes for any dynamically typed language. How is that an argument against Lua in particular?
> Have you read large Lua codebases written by others?
No, because I use it as a scripting language, as intended. I totally agree that one shouldn't use dynamically typed languages for building large applications. But again, this is not specific to Lua.
Oh, as a scripting language you embed into your project so that you can write scripts for it — there is hardly anything better than Lua. The C code is super clean and easy to embed and modify.
But once that project gets passed to the next maintainer — I'm not sure I'd pick Lua over Forth or Scheme.
I would pick Tcl instead, but I am biased. :)
I occasionally have to write Tcl. No thanks :)
I wrote it for four years as my main language (1999 - 2003), alongside C.
Yes, I was going to write this comment a few hours ago but never got around to it. Working on other people's Lua can be very painful.
Even the fact that people really want to write object-oriented code, but every project rolls its own class system, is a problem.
When I write Lua it is just tables of data and functions. I try to keep it as simple as possible.
I've been enjoying writing games for the Playdate, and in Love2d.
If Lua had zero-based indexing it wouldn't be Lua. Also, removing '.length - 1' globally will reduce gas emissions by 1% worldwide (my guess).
I held a similar opinion several years ago. The main thing is that lua has less magic than js largely because it's been allowed to break compatibility.
My main example is self in lua which is just the first argument of a function with some syntactic sugar vs this in javascript which especially before 'bind' often tripped people up. The coercion rules are also simpler largely by virtue of making 0 true.
Lua has goto instead of continue.
Lua's metatables also cover a fair bit more ground than JS's prototypes. For example, indexing, math operations (including bitwise), comparisons, and equality can all be overridden.
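A small sketch of the kind of overloading metatables allow (the names here are made up for illustration; the bitwise metamethods such as __band additionally need Lua 5.3+):

    local Vec = {}
    Vec.__index = Vec   -- method/field lookup

    function Vec.new(x, y) return setmetatable({ x = x, y = y }, Vec) end

    Vec.__add      = function(a, b) return Vec.new(a.x + b.x, a.y + b.y) end
    Vec.__eq       = function(a, b) return a.x == b.x and a.y == b.y end
    Vec.__tostring = function(v) return "(" .. v.x .. ", " .. v.y .. ")" end

    local a, b = Vec.new(1, 2), Vec.new(3, 4)
    print(tostring(a + b))          --> (4, 6)
    print(a + b == Vec.new(4, 6))   --> true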
Some of the language features like operators can be overloaded using metatables, much like Python's double underscore methods. I do think that the inability to do this in JS is what held it back from becoming a data processing powerhouse. But it perhaps can be made to run fast precisely because of the lack of such overrides (otherwise we'd be running Python at JS speeds). That said, LuaJIT seems like a good compromise and might be the best of both worlds.
I love JS and Ruby, and I spent a bit of time maintaining a web app in Lua.
There are parts about the language I really enjoy. I like dynamic languages; I don't run into type issues so often that it bothers me, but the tooling around Lua still leaves me wishing for more.
Part of this is really my own ability, or lack thereof, but I feel slow and constrained in lua. I have a hard time debugging, but I don't feel these things in other languages.
I like Lua better because it's minimalist. Javascript feels like a kitchen-sink language.
Otoh, the missing ++, +=, ..= operators really bother me.
But that's just personal taste, not objective by any means.
Yeah. I think if I were to pick one reason, it would be Lua is "minimalist". But, like I wrote, maybe, it too, would have just had lots of stuff added onto it if it had been in the browser. Hard to say.
I love ++, but you know what? I was shocked when a co-worker pointed out that it is frowned upon to use ++ in javascript. At some big companies, their linter settings mark ++ as something that should be changed.
https://eslint.org/docs/latest/rules/no-plusplus
Lol, I suppose there is no accounting for taste, but the justification for that rule is really something.
Apparently it's really confusing if you put a bunch of newlines between the ++ operator and its operand. No kidding.
The origin of this rule was that Douglas Crockford doesn't like ++.
From JavaScript: The Good Parts (May 2008):
> The ++ (increment) and -- (decrement) operators have been known to contribute to bad code by encouraging excessive trickiness. They are second only to faulty architecture in enabling viruses and other security menaces. The JSLint option plusplus allows the use of these operators.
Needless to say, I'm with you and hackthemack on this.
Nil is better than how undefined works. It's not just as bad and then more bad on top.
There's also a Javascript implementation of Lua which allows one to run Lua in a browser:
https://github.com/fengari-lua/fengari-web
I don’t know if this can access the DOM in Lua, but considering that Fengari is in Javascript, adding a _DOM global variable should not be too hard (if it hasn’t already been done).
No need for _DOM.
Fengari lets you access the DOM. It's pretty cool.
local js = require "js"
local window = js.global
local document = window.document

window:addEventListener("load", function()
  local el = document:getElementById("main")
  el:addEventListener("click", function(evt) js.console:log(evt.target) end)
  document.body:appendChild(el)
end)
IIRC Brendan Eich has talked about this - if he’d adopted Lua in 1995 instead of creating JavaScript, it wouldn’t have been Lua 5.x but Lua 2.x.
Lua has improved substantially from version to version because it’s been able to break compatibility. That wouldn’t be possible in the browser, so today we’d still be stuck using (an augmented version of) the outdated Lua 2.x.
Yes, he did:
https://web.archive.org/web/20191024193930/https://twitter.c...
“Lua in 1995 was very different, no coros e.g., and no one would be happy if it flash-froze and then slow-forked on a different path from Lua's. See https://news.ycombinator.com/item?id=1905155 and yes, wasm is the right long-term plan. In 1995 it was supposed to be Java, but that didn't pan out!”
Fengari has DOM interop.
http://fengari.io/
It is a Lua reimplementation in JS.
There is also nelua (https://nelua.io/), which can compile to WASM to allow its usage in the browser: https://github.com/edubart/nelua-game2048/
It is interesting that so many of the largest languages were developed in a couple-year time frame in the early-to-mid 90s. Python, Javascript, Java, Lua, R. All of these were developed 91-95 and make up the bulk of development today.
It is an interesting observation!
I'm sure there are a few unrelated factors influencing that, plus some inadvertent cherry-picking. But I do think there is something to the observation too.
If I had to make a guess, I'd point to a combination of:
1. This was right around the time that computers were fast enough to afford the runtime cost of garbage collection while still delivering usable apps. The GC languages before the 90s (Lisp, Scheme, Smalltalk, Self) had reputations for abysmal performance and have largely (but not entirely) died out.
2. This was also the boom of the Internet and web which shifted a lot of computation to server-side after the PC boom had shifted it client-side. That also enabled somewhat slower languages with GC to thrive because first-mover advantage was more valuable than using your servers efficiently. You could just throw more hardware at it.
3. That boom also created a huge influx of new developers who picked up whatever language happened to be hot at the time. And once they had, they tended to stick with it. I can't find a good graph of the number of software engineers over time but I'd bet that there's a noticeable uptick around the dotcom boom.
After mulling it over a bit, plus seeing a few other responses (PHP, Ruby, etc.), I think the internet is the reason.
C, C++, Ada, Basic, Fortran: those were written assuming they would be used for systems code and numerical code. There is then some inertia keeping the most successful ones from those periods going in their spaces.
Then these new languages with new ideas (GC, dynamic typing, etc.) came out and became successful in the newer web and application spaces. Now, why they won out and not LISP or Smalltalk or what have you, I am not sure. But my hypothesis is that the web is a big part of it.
Languages developed in that time matured just as good binary package managers started popping up, is my pet theory. Before that, getting a development environment for a new language was serious work, and people did things like stick to Perl since it happened to be installed already.
Not true.
I was programming in the 90s when these languages emerged. Development environments were emacs, vi, Brief, the Borland IDE, etc. There were a few other IDEs available, but at about $200 per seat.
None of the scripting languages mentioned came by default on Unix or Windows. You had to download them from their own websites.
It was mostly Visual Basic, C, COBOL that were popular.
I think that's what I mean. After the time you talk about ('90s), these languages matured, and they happened to mature around the same time binary package managers became a thing, i.e. in the early-to-mid '00s.
There was also ELK Scheme, the Extension Language Kit, a Scheme interpreter designed to be used as an extension language for other applications.
https://www.usenix.org/legacy/publications/compsystems/1994/...
>Elk, the Extension Language Kit, is a Scheme implementation that is intended to be used as a general, reusable extension language subsystem for integration into existing and future applications. Applications can define their own Scheme data types and primitives, providing for a tightly-knit integration of the C/C++ parts of the application with Scheme code. Library interfaces, for example to the UNIX operating system and to various X window system libraries, show the effectiveness of this approach. Several features of Elk such as dynamic loading of object files and freezing of fully customized applications into executables (implemented for those UNIX environments where it was feasible) increase its usability as the backbone of a complex application. Elk has been used in this way for seven years within a locally-developed ODA-based multimedia document editor; it has been used in numerous other projects after it could be made freely available five years ago.
Also GNU Guile:
https://en.wikipedia.org/wiki/GNU_Guile
>GNU Ubiquitous Intelligent Language for Extensions[3] (GNU Guile) is the preferred extension language system for the GNU Project[4] and features an implementation of the programming language Scheme. Its first version was released in 1993.[1] In addition to large parts of Scheme standards, Guile Scheme includes modularized extensions for many different programming tasks.[5][6]
Also Winterp, which used XLisp:
https://dl.acm.org/doi/10.1145/121994.121998
>Winterp is an interactive, language-based user-interface and application-construction environment enabling rapid prototyping of applications with graphical user interfaces based on the OSF/Motif UI Toolkit. Winterp also serves as a customization environment for delivered applications by providing a real programming language as an extension language. Many existing user-interface languages only have the expressive power to describe static layout of user interface forms; by using a high-level language for extensions and prototyping, Winterp also handles the dynamic aspects of UI presentation, e.g. the use of direct manipulation, browsers, and dialog. Winterp makes rapid prototyping possible because its language is based on an interpreter, thereby enabling interactive construction of application functionality and giving immediate feedback on incremental changes. Winterp's language is based on David Betz's public domain Xlisp interpreter which features a subset of Common Lisp's functionality. The language is extensible, permitting new Lisp primitives to be added in the C language and allowing hybrid implementations constructed from interpreted Lisp and compiled C. Hybrid implementation gives Winterp-based applications the successful extension and rapid-prototyping capabilities of Lisp-based environments, while delivering the multiprocessing performance of C applications running on personal Unix workstations.
And Tcl/Tk of course!
https://www.tcl-lang.org/
And on the commercial side, there was Visix Galaxy, which was extensible in PostScript, inspired by NeWS:
https://www.ambiencia.com/products.php
>The Visix Galaxy project was a ridiculously overpriced and overfeatured portable GraphicalUserInterface. You could do things like swivel an entire panel full of their custom widgets 32 degrees clockwise, and it would render all its text at this new angle without jaggies. The company went out of business after gaining only a handful of customers. For USD$ 10,000 a seat they sure didn't see the OpenSource movement coming. Their last attempt before going under was (guess what?) a Java IDE.
Galaxy competed with Neuron Data Systems in the "cross platform gui framework" space (which got steamrolled by the web permanently and for a window of time Java):
Java 1996, C++ only got standardized in 1998, C in 1990 (technically the standard is from 1989, but there was a short ratification in 1990), Delphi is from 1995 (not that big a player nowadays, but plenty of its influences live on in C#, Typescript and Kotlin).
It goes to show how much investment is required for a programming language to actually take off at scale.
However, in a couple of years we will be asking the computers to perform tasks for us, and the actual compiler frontend will be irrelevant to the AI runtime.
The language of R is S, which originated at Bell Labs in 01976. Python began development in 01989, although Guido didn't release it until 01991. And the top 20 on https://www.tiobe.com/tiobe-index/ are Python, C (01972?), C++ (01982?), Java, C# (01999? though arguably it's just a dialect of Java), JS, Visual Basic (first released 01991, within your window), Golang (02007), Delphi (under this name in 01995 but a dialect of Object Pascal from 01986, in turn a dialect of Pascal, from 01970), SQL (01973), Fortran (01957), Perl (01987), R, PHP (01995, within your window!), assembly (01947), Rust (02006), MATLAB/Octave (01984), Scratch (! 02003), Ada (01978?), and Kotlin (02011).
By decade, that's one language from the 40s, one language from the 50s, no languages from the 60s, 5 languages from the 70s, 5 languages from the 80s, 4 languages from the 90s, 3 languages from 0200x, one language from the 02010s, and no languages from the 02020s.
Lua is #33 on TIOBE's list, but given its prevalence in Roblox (as Luau), WoW, and other games, I suspect it should be much higher.
For some reason, CUDA (a dialect of C++) and shader languages like GLSL don't show up in the list at all.
— ⁂ —
I think most of what's going on here is that it takes a new language a long time to get good, and it takes a new good language a long time to get popular. Perl, Python, Java, PHP, and JS became popular because of the Web; https://philip.greenspun.com/panda/server-programming explains why Perl, Python, and PHP did, and of course Java and JS became popular because they were the only languages you could make interactive web pages in:
> You would think that picking a Web site development language would be trivial. Obviously the best languages are safe and incorporate powerful object systems. So let's do everything in Common Lisp or Java. Common Lisp can run interpreted as well as compiled, which makes it a more efficient language for developers. So Common Lisp should be the obvious winner of the Web server language wars. Yet nobody uses Common Lisp for server-side scripting. Is that because Java-the-hype-king has crushed it? No. In fact, to a first approximation, nobody uses Java for server-side scripting. Almost everyone is using simple interpreted languages such as Visual Basic, PHP, Perl, or Tcl.
> How could a lame string-oriented scripting language possibly compete in power with systems programming languages? Well, guess what? The only data type that you can write to a Web browser is a string. And all the information from the relational database management system on which you are relying comes back to the Web server program as strings. So maybe it doesn't matter whether your scripting language has an enfeebled type system.
Some people think that writing years as 2025 is wrong because this will lead to problems in year 9999 (the y10k bug? I'm not sure if that's what they call it), so they decided to introduce a leading zero, as if it would solve something rather than just postponing the problem to 99999.
This assumes that:
- we will still be using the same calendar system in 8000 years
- people 8000 years in the future will leave off the leading 1 of years for some reason, and will use a leading 0 to disambiguate dates from the previous 10000 year period.
I would say it is a symbolic reminder to care about the long term consequences of our actions. In the same way we have holidays to remind us about the environment or mortality.
That's silly. The y2k bug was because the year was written as 65, instead of the full year being 1965, so information was lost. Writing 2025 has no missing information.
"The present moment used to be the unimaginable future." -Steward Brand
"How can we invest in a future we know is structurally incapable of keeping faith with its past? The digital industries must shift from being the main source of society’s ever-shortening attention span to becoming a reliable guarantor of long-term perspective. We’ll know that shift has happened when programmers begin to anticipate the Year 10,000 Problem, and assign five digits instead of four to year dates. 01998 they’ll write, at first frivolously, then seriously." -Steward Brand
I do agree with your point but I also think that there is a lot of inertia in the sector (rightfully so!) and it is very difficult for languages to become established if they don't come with a "unique selling point" of sorts, which to me explains how new popular languages have become rarer.
That selling point, for Lua, is the super easy integration via C-API to me (=> making existing compiled applications scriptable), thanks to uncomplicated build (dependency free, simple), the straightforward C-API and the ease of exposing C "modules" to Lua.
On a sidenote:
Don't you think that Y10k-safe dates are somewhat inconsistent with referencing previous decades directly? Those dates are also obnoxious to parse for humans (myself, at least).
>C# (01999? though arguably it's just a dialect of Java)
That's like saying Java is a dialect of C++. Java was specifically designed as a "fuck you" to C++, and C# was specifically designed as a "fuck you" to Java.
While at a political level that's reasonable, at both the semantic and the syntactic level, the first version of C# was very close to Java, much closer than the first version of Java was to C++. https://learn.microsoft.com/en-us/dotnet/csharp/whats-new/cs... is a very vague overview.
.NET was being designed around J++, Microsoft's Java extensions; the Cool research language only became C# and took over J++'s role in .NET due to Sun's lawsuit.
The lawsuit is more than well known, and the background to .NET's planned use of J++ is in the papers published by Don Syme of F# fame regarding the history of .NET and the F# HOPL paper.
I've known and worked with James Gosling for years before Java (Live Oak), on his earlier projects, Emacs at UniPress and NeWS at Sun, and fought along side him against Sun management trying to make NeWS free in 1990 (and I left Sun because they broke the promises they made us and spilled a lot of blood), so I didn't need to learn about Java's history from Wikipedia.
James's email that convinced me to go work with him at Sun on NeWS in 1990:
Here's a Stanford talk James Gosling gave about Java that I attended in 1995, where he talks about C++, his original tape copy program that turned into a satellite ground control system, how he holds the world record for writing the largest number of cheesy little extension languages to go, and his implementation of Emacs sold by UniPress (which RMS calls "Evil Software Hoarder Emacs"), and his design and implementation of NeWS (formerly SunDew), a PostScript based network extensible window system.
James Gosling - Sun Microsystems - Bringing Behavior to the Internet - 1995-12-1:
>Video of James Gosling's historic talk about Java, "Bringing Behavior to the Internet", presented to Terry Winograd's user interface class at Stanford University, December 1, 1995.
In that talk I asked him a couple questions about security and the "optical illusion attack" that he hedged on (44:53, 1:00:35). (The optical illusion attack is when the attacker simply draws a picture of a "secure" pop up dialog from your bank asking for your password.)
He mentioned offhand how a lot of the command and control systems for Operation Desert Storm were written in PostScript. That was his NeWS dialect of PostScript, and the system, called "LGATE", was written primarily by Josh Siegel at LANL, who later came to work at Sun in 1990 and rewrote the NeWS PostScript interpreter himself, then went on to write an X11 window manager in PostScript, again proving James's point that people always did a lot more with his cheesy little extension languages than he ever expected (which also held true with Java).
Josh's work on simulating Desert Storm and WWIII with NeWS at LANL:
I also saw Bill Joy's much earlier talk at the 1986 Sun Users Group in Washington DC, where he announced a hypothetical language he wanted to build called "C++++-=", and that he talked about in subsequent presentations.
I think that was the same talk when Bill said "You can't prove anything about a program written in C or FORTRAN. It's really just Peek and Poke with some syntactic sugar". More Bill Joy quotes:
James eventually realized that concept as Java, showing that the kernel inspiration of writing a "fuck you to C++" language existed long before James invented "Live Oak", even soon after C++ was invented. But "Java" was a much better name than "Live Oak" or "C++++-=" fortunately -- thanks to Kim Polese -- though not as succinct and musically inspired as "C#"!
>The peak computer speed doubles each year and thus is given by a simple function of time. Specifically, S = 2^(Year-1984), in which S is the peak computer speed attained during each year, expressed in MIPS. -Wikipedia, Joy’s law (computing)
>“C++++-= is the new language that is a little more than C++ and a lot less.”
-Bill Joy
>In this talk from 1991, Bill Joy predicts a new hypothetical language that he calls “C++++-=”, which adds some things to C++, and takes away some other things.
>“Java is C++ without the guns, knives, and clubs.” -James Gosling
A way to catch undeclared globals at run time is to lock down _G with a metatable:

function set(name, val)
  -- declare a global explicitly (defaults to false so the key actually exists in _G)
  rawset(_G, name, val or false)
end

function exists(name)
  if rawget(_G, name) then return true end
  return false
end

throwError = {
  __newindex = function(self, name) error("Unknown global " .. name) end,
  __index    = function(self, name) error("Unknown global " .. name) end,
}
setmetatable(_G, throwError)
Naturally, the main catch is that this only detects the violation at run-time. It also won't stop you from accidentally overwriting a global variable that already exists.
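For example, with the snippet above in place (hypothetical names):

    set("debugMode", true)       -- declare the global explicitly
    print(debugMode)             --> true
    print(exists("debugMode"))   --> true
    print(debugMod)              --> error: Unknown global debugMod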
It's because Tcl, like SQLite, operates on a peculiar metaphysical principle: everything is a string until proven otherwise, and even then, it's probably still a string.
Also, D. Richard Hipp, to whom we owe thanks for SQLite, was and perhaps still sits on the Tcl board (I may be wrong about that, but Hipp holds significance in the Tcl community).
In my mind:
Tcl is the quietly supportive roommate who keeps making coffee and feeding LISP-like functionality until the world finally notices its genius.
Lua sits across the table, sipping espresso with a faintly amused expression, wondering how everyone got so emotionally entangled with their configuration files.
Lua is one of the easiest configuration file formats I've had the pleasure of working with. Readable. Has comments. Variables. Conditionals.
Everyone (including me): "oh no, no, you don't want a full Turing complete language in your configuration file format"
Also Everyone: generating their configuration files with every bespoke templating language dreamed of by gods and men, with other Turing complete languages.
You could solve this with a capabilities permissions system. That way the config files can be written in the same language but have configured permissions that are different from the rest of the programming language. So you could restrict the config files from resources like threads, evaling source, making network requests and whatnot. Come to think of it you could even probably section off parts of the language behind capabilties such that the config files could be configured to be a not-Turing complete subset of the language.
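Lua's environment mechanism gets you most of the way there already. A minimal sketch (Lua 5.2+ loadfile signature; the file name and the whitelist are made up for illustration): the config chunk only sees what you explicitly hand it, so io, os, require and friends are simply out of reach.

    -- Whitelisted environment for config files (illustrative).
    local env = {
      pairs = pairs, ipairs = ipairs, tostring = tostring,
      math = math, string = string,
      config = {},   -- the table the config file is expected to fill in
    }

    local chunk, err = loadfile("app_config.lua", "t", env)  -- "t" = text only, no bytecode
    if not chunk then error(err) end
    chunk()          -- runs with env as its _ENV (on Lua 5.1 you'd use setfenv instead)

    print(env.config.port)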
TCL 9 brought some welcome string improvements, and things run faster overall. But in my case, it's hard to say how well that's actually played out, partly because I haven't done the work to find out. My TCL scripts and apps work well enough to allow me to be lazy about them.
Performance is up, but so is my inertia. So while TCL 9 could be transformative, for now it remains a white paper I've skimmed, not a revolution I've implemented.
I think TCL does an opaque thing: everything "is" a string, but if you don't use it as a string, it's actually stored in some optimized format. Then it converts back to a string on demand.
I still prefer Lua personally. Their type system is easy for me to understand
Lua is simple and elegant, and I much prefer it to Tcl.
Lua is in games and in LuaTeX, and when you have the choice of embedding a LISP, a FORTH or Lua in a larger application, it is often the most maintainable, runtime-efficient and low-memory footprint option of all.
I enjoy Lua and use it professionally, but when bash (and AWK) don't suffice, the glue is Perl, because it has pipes which you can use to connect the output of one command to the input of another, or to a file.
What would be SQLite's equivalent of indexing starting from 1, not 0? Off the top of my head I can't think of anything that would go so much against the grain.
Rebol is the cleanest/greatest language I've read code in, but the VM is the slowest VM I've ever written code for. Mind you, I only did the first 10 Euler exercises, but the only thing that has it beat is a shell script that forks to dc/bc on each math expression.
My only context for using Lua is my neovim configuration. Does anyone know of any good books or tutorials that build something more advanced using only Lua? Anything of note to consider/read/watch?
Shameless plug time: I have written a public domain book which looks at using Lua (Lua 5.1, but the code also works in newer versions of Lua as well as Luau and LuaJIT) as a text parsing engine. The book assumes familiarity with other common *NIX scripting languages (such as AWK, Perl, or Python) and goes over in detail the pain points for people coming from a *NIX scripting background:
Thanks for the recommendation, and for writing the book! I'm reading the introduction; it doesn't answer why you chose to fork Lua and create Lunacy. What were you trying to solve with Lunacy that Lua couldn't do?
The TOC looks great, I will read this soon. Need to finish "Debugging CSS" first (another good book IMO).
The main reason I made Lunacy was to have a standard compile of Lua 5.1, since it's possible to make a Lua 5.1 compile with, say, 32-bit floats, or one which only supports integers but not floats.
Lunacy also has a few built in libraries which are not included with Lua 5.1, such as binary bitwise operations (and/or/xor). It also fixes some security issues with Lua 5.1 (better random number generator, hash compression algorithm which is protected from hash flooding attacks).
In addition, I have made a tiny Windows32 binary of Lunacy.
Don’t worry about the Lunacy changes; all of the examples in the book work with bog standard Lua 5.1 with a bit32 library (bit32 is common enough most OSes with a Lua 5.1 package also have a bit32 package for Lua 5.1).
Don't have tutorials or books, but I've had a ton of fun using Lua with LOVE2D [0] for gamedev, and also Redbean [1] for building super small portable web applications. Earlier this year, I built a mini CMS [2] inspired by rwtxt with Redbean.
Haven't heard of love2d but have heard of pico-8. I've avoided most game dev tutorials because they seem overly focused on beginners, which is fine, but I want to find more advanced materials that assume the reader knows some basics.
Maybe I should reconsider and dive more into game dev.
> It's aimed at programmers who have some experience but are just starting out with game development, or game developers who already have some experience with other languages or frameworks but want to figure out Lua or LÖVE better.
Also on the topic of game engines with lua scripting, the wonderful Defold always deserves a mention https://defold.com/
Definitely include Roberto's Programming in Lua book in your list. Especially if you'd like to script Lua together with C. The book has a good primer on the Lua-C API in its latter half.
I always wrote Lua off, scoffing at the 1-based indexing, until I was "forced" to learn it thanks to Neovim. What a delightful little language it is. I do wish I could do certain things less verbosely (lambdas would be nice) -- but then again, I defeat myself by suggesting it, because not having all the features makes Lua so approachable.
I used Lua professionally. I prefer the 1-indexing... it just feels more natural. For some reason the C apologists here will scream that 0-based is the only way to go (which it is not; it is just a historical artifact). Languages like Ada allowed you to use either 0 or 1 (or any arbitrary starting index).
Same here, in fact something I wish the neovim team would do is create a book where popular plugin authors create tutorials that recreate basic functionality of their plugins.
Seems like a no-brainer that would help bring in more revenue too; it'd also be an "evergreen" book, as new authors could contribute over time.
I can't be the only one that would immediately buy a copy. :D
I'm actually trying to work on a video series to do just this. I've made my own rudimentary plugins reproducing several popular ones, and would like to walk through how I made: a) a file-tree, b) a picker/fzf replacement, c) a hop/leap replacement, d) a surround plugin, e) a code-formatter, f) hydra (sub-modes), g) many "UI" (interactive) buffers, etc.
None of these are published because the popular ones are better and provide more functionality, but I want to share what I believe is more valuable: what I learned while writing them.
(I personally don’t use patches like this because “Lua 5.1” is something pretty standardized with a bunch of different implementations; e.g. I wrote my Lua book with a C# developer who was using the moonsharp Lua implementation)
LuaJIT is in somewhat active development, with 40 commits so far this year, although these are mostly bug fixes (some for bugs introduced by LLVM). The main new feature, if you can call it that, is "support for Apple hardened runtime."
You say that as if Lua 5.3 and 5.4 were better than Lua 5.2 (which LuaJIT has support for most of the new features of) or 5.1, rather than merely newer. But programming languages don't decay like your teeth.
That happens with all Lua applications, because Lua has never aimed for backward compatibility from one version to the next, so applications basically never upgrade to a new version of Lua.
New applications using LuaJIT will continue to be on Lua 5.1. And applications that do upgrade their dependencies but use LuaJIT are going to be stuck on 5.1 maybe forever, too.
Yes, and there's nothing wrong with that. It doesn't result in the same degree of fragmentation in Lua as it did in the Python 2/3 split, because it's both socially accepted and usually technically easy to write code that works in both Lua 5.1 and Lua 5.4 and everything in between.
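For example, the usual cross-version shims are one-liners at the top of a module (a common pattern, not anything official):

    -- Works on Lua 5.1, LuaJIT, and Lua 5.2 through 5.4:
    local unpack = table.unpack or unpack   -- unpack moved into the table library in 5.2
    local load   = loadstring or load       -- loadstring was folded into load from 5.2 on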
Except they are? Lua 5.3's bitwise operators are a big improvement over the bit32 library functions (bit32.band, bit32.bor, ...), the addition of an integer type makes the language more suitable for embedded systems that don't have an FPU, and Lua 5.4 added the const attribute for local variables.
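For comparison, a tiny sketch of what that buys you (this chunk needs Lua 5.3 or later to even parse):

    print(0xF0 & 0x0F)   --> 0
    print(0xF0 | 0x0F)   --> 255
    print(0xF0 >> 4)     --> 15
    print(3 // 2)        --> 1   (integer division, also new in 5.3)
    -- On Lua 5.2 the equivalents are bit32.band, bit32.bor, bit32.rshift, ...;
    -- on LuaJIT, the bit library (bit.band, bit.bor, ...).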
Those are tradeoffs, and I'm not convinced they're good ones. Suitability for processors that are so small they don't have an FPU isn't relevant to LuaJIT in any case, but you've been able to compile Lua with integer numbers since 1.0.
I spent a few nights trying to implement a Lua interpreter myself and it was still like 10x slower than PUC Lua, even before adding a GC. I'm not sure how they do it, it looks like regular C code
LuaJIT is amazing. I find it insane there are no Schemes that are able to match it as a tiny, embeddable scripting language with a JITer. GNU Guile is an absolute gargantuan monster in comparison.
When comparing speed I use simple tests like a loop printing an incremented line number or reading from stdin and printing to stdout. These simple tests are useful for me because, when combined with pattern matching or regular expressions, simple I/O tasks like these are actually what I use a "memory safe" language for
dino is slightly faster than lua (not luajit)
but spitbol is actually faster than lua, dino and luajit
ngn k is slightly faster than spitbol but lacks built-in pattern matching or RE
FWIW, I do not use a terminal emulator. I only use textmode No graphics layer. No "desktop"
> These simple tests are useful for me because, when combined with pattern matching or regular expressions, small, simple I/O tasks like these are actually what I use a "memory safe" language for
Are you sure you aren't being bottlenecked by your terminal?
For many, Lua is primarily known as the Roblox language. Pretty impressive that it's the language of use in a game (/set of games) with 380 million monthly active players - currently the most popular in the world.
Lapis is very cool, but I really struggle with a lack of examples and pre-built solutions for some things.
Leafo's itch.io is built with it, and I maintain snap.berkeley.edu. A great tool, but I'm also many times more productive in Rails (it's an unfair comparison, to be sure!). Openresty + Lapis is quite performant and low-overhead, which is great.
I see your point. I've just craved a rails-like experience in Lua for a while and don't believe there's anything out there yet built on Lua that can match with the big boys (rails, .NET, etc.)
A useful design pattern is to write highly efficient "engine" code in C/C++ and then tie it together to highly customizable application code with an embedded scripting language.
Lua is great for this, and while you could use a LISP (Emacs, AutoCad) or a FORTH (any real-life example not from the radio telescope domain?) or Tcl/Tk (https://wiki.tcl-lang.org/page/Who+Uses+Tcl), Lua is small (as in number of concepts to understand), easy to read and understand, and compact (as in tiny codebase to integrate, few dependencies).
So many gaming startups have put their money on Lua.
The Lua-C API is also really consistent and straightforward. Bindings can be generated mechanically, of course, but it's really easy to embed by hand, and the documentation is superb.
Here’s my bit of public domain code for iterating through tables in Lua so that the elements are sorted. This routine works like the pairs() function included with Lua:
-- Like pairs() but sorted
function sPairs(inTable, sFunc)
  if not sFunc then
    sFunc = function(a, b)
      local ta = type(a)
      local tb = type(b)
      if ta == tb then
        return a < b
      end
      return ta < tb
    end
  end
  local keyList = {}
  local index = 1
  for k, _ in pairs(inTable) do
    table.insert(keyList, k)
  end
  table.sort(keyList, sFunc)
  return function()
    local key = keyList[index]   -- `local`, so we do not leak a global named "key"
    index = index + 1
    return key, inTable[key]
  end
end
Example usage of the above function:
a = { z = 1, y = 2, c = 3, w = 4 }
for k, v in sPairs(a) do
  print(k, v)
end
With a sort function:
a = { z = 1, y = 2, c = 3, w = 4 }
function revS(a, b)
  return a > b
end
for k, v in sPairs(a, revS) do
  print(k, v)
end
(Yes, this is a lot easier to do in Perl or Python, since those languages, unlike Lua, have built-in list iterators, but it's possible to do in Lua too.)
If you didn't return a closure, you could set the metatable's __pairs to use your function. Sadly, you couldn't do this without keeping some sort of cache, which would be a terrible waste of memory; but then again you're already creating an iterator that walks everything up front, which to me defeats the point of an iterator (not iterating over everything, and being able to break out after a single pass).
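For reference, here's what hooking that up through __pairs looks like on Lua 5.2+ (where pairs() consults the metamethod); it simply reuses the sPairs() function above, full key-list pass and all:

    local t = setmetatable({ z = 1, y = 2, c = 3, w = 4 }, { __pairs = sPairs })
    for k, v in pairs(t) do   -- plain pairs() now iterates in sorted key order
      print(k, v)
    end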
It's a very flexible way to do realtime/embedded, performance-critical services (e.g., a game server or API gateway) where Lua's speed and low overhead matter.
"Please don't post insinuations about astroturfing, shilling, bots, brigading, foreign agents and the like. It degrades discussion and is usually mistaken. If you're worried about abuse, email hn@ycombinator.com and we'll look at the data."
I was thinking a while back, how nice it would be if lua was the scripting language in the browser instead of javascript. There are some projects to compile lua to wasm and have it run in the browser...
https://pluto-lang.org/web/#env=lua%3A5.4.6&code=if%20_PVERS...
But interoperability with the DOM is the missing key.
Still, if lua was used instead of javascript, I could see myself saying... man, I wonder what browser development would be like if we replaced lua with x.
What specifically do you think would be better? Lua shares many of JS's quirks (like the relationship between arrays and non-array objects, the behavior of undefined for non-existent object properties, metatables are somewhat similar to JS prototypes, etc.) and adds a bunch more (lack of continue statement, 1-indexing, cannot have nil values in tables).
I can see people liking or disliking Lua and JS both, depending on taste, but it's hard to see someone liking one and disliking the other.
Lua has tail call optimization and js doesn't. For me, this is a dealbreaker for js.
Lua also has operational advantages compared to javascript. You can build it from source in at most a few seconds and run it anywhere that has a c compiler. The startup time is negligible even compared to a compiled c program so you can run entire scripts that do very useful things faster than most js engines can print hello world. There is also the benefit that in spite of what this article discusses, it is possible for most problems to write a lua solution that works for all lua >= 5.1. This is wonderful because it means if I stick to the standard library, the bitrot is essentially zero.
Calling into c is very straightforward in Lua, especially with luajit, which makes it superior to js as a scripting language (the node ffi api is quite painful in my experience and ffi runs against the js execution model).
Lua also essentially got lexical scoping of local variables correct from the beginning while js blessed us with the nightmare of var.
Of course Lua is far from perfect, but while there are some similarities, the differences are quite significant and meaningful.
> Lua also essentially got lexical scoping of local variables correct from the beginning while js blessed us with the nightmare of var.
That was not my experience when I was working with lua. Did anything change since? Asked google. Answered :
> In Lua, if a variable is assigned a value inside a function without being explicitly declared with the local keyword, it will automatically become a global variable. This is because, by default, all variables in Lua are global unless explicitly specified as local.
Yeah, this puzzled me too. I'm assuming they're referring to the semantics of "var" in JS vs "local" in Lua, with the latter resembling "let" in JS, which doesn't have broken scoping.
> Calling into c is very straightforward in Lua, especially with luajit, which makes it superior to js as a scripting language (the node ffi api is quite painful in my experience and ffi runs against the js execution model).
As a scripting language for browsers this is an antifeature, and the fact Lua comes by default with a bunch of features that allow loading arbitrary binary code in the process makes it pretty annoying to properly use it as a sandboxed scripting language.
You're conflating a few things here. V8 and Spidermonkey aren't the only interpreters out there. There are a number of them explicitly designed to be small, easy to compile and to embed, such MuJS, Fabrice Bellard's QuickJS, and more still. I can't speak to their FFI interfaces, but you can't judge JS's ability to call C functions based off that of Node/V8. I'm not sure how FFI runs against its execution model given JS is generally used as an embedded language, necessitating foreign function calls.
Why is tail call optimization a dealbreaker for you? That's very specific...
Guaranteed TCO makes it practical to code recursive algorithms in natural style.
IMHO, Tail call optimization makes sense only when it's enforced (like in clojure), otherwise it's wild.
I think a lot of people are missing this part, weighing things like 1 based indexing and curly brackets.
All of that is trivial when you consider the Lua reference implementation. It is beautiful.
If I remember correctly, TCO is now part of the ECMAScript standard. Safari has implemented it. The issue is that others engines have not because they are concerned about stack unwinding and stacktraces.
In theory JavaScript also has it on its standard, unfortunately that is yet another thing that browser vendors don't agree on.
I agree mostly in that Lua and Javascript are both similar, and like I said in my post above, I could see myself saying the exact opposite if Lua had been included in the browser.
The things I do not like about Javascript can easily be shot down in an argument. Some of it was having to work with Javascript (and it's evil cousin JScript) in the 90s and early 00s.
The type coercion and in the early days people used '=='. I think === did not even appear until ie6?
[] == ![] // true
The lack of a lot of easy helper functions in the standard lib. That now are provided by people writing a bunch of stuff in the npm ecosystem.
The npm ecosystem itself. Lack of security. Lack of... curation? (Also, all this would have probably happened anyway if Lua was in the browser)
I also think javascripts long history has created a legacy of different paradigms
variable declaration var, let, const
function declaration
function f1() {}
const f2 = function() {};
const f3 = () => {};
const obj = { f4() {} };
There is a lot of stuff like this in javascript. I could probably make a blog post about it. But the above gives the general idea of my complaints.
In practice you don't run into these issues often. I'm annoyed when you see different function declaration conventions in the same codebase, but generally () => is used for either one line functions or inline lambdas, and function foo(){} for everything else. Nobody uses var anymore.
The implicit conversions is a definite footgun tho.
Funny you mention nobody uses var anymore when I just saw a post on here yesterday that perf critical code still uses var since it's faster
Bundlers will convert let/const to var, assign classes and functions to var etc but generally people don't write it themselves unless they want to (ab)use its semantics for performance reasons.
Do people often use bundlers for the backend?
For me, JS has just too much magic, in particular the behavior of 'this', and lots of weird quirks, like "for in" vs. "for of". Lua, on the contrary, is very predictable and therfore easy to understand.
One killer feature of Lua (that surprisingly few scripting languages have) is stackful coroutines, i.e. you can yield across nested stack frames. Unlike JS or Python, there is no artificial split between generators and async/await and no need for function coloring.
If Lua had zero-based indexing, it would be close to perfect :)
Have you read large Lua codebases written by others? It is write-only language. All your "very predictable" features are overloadable, at runtime. No static typing to rest your eye on. It is a swamp.
"But, just write good code" you will say. Just like with Perl, some languages are designed in a way to discourage writing good code...
I'm talking about the core language, which I do find very predictable. You can go crazy with any language. Lua is definitely not worse than Python or JS in this regard.
> No static typing to rest your eye on.
That goes for any dynamically typed language. How is that an argument against Lua in particular?
> Have you read large Lua codebases written by others?
No, because I use it as a scripting language, as intended. I totally agree that one shouldn't use dynamically typed languages for building large applications. But again, this is not specific to Lua.
Oh, as a scripting language you embed into your project so that you can write scripts for it — there is hardly anything better than Lua. The C code is super clean and easy to embed and modify.
But once that project gets passed to next maintainer — I'm not sure I'd pick Lua over Forth or Scheme.
I would pick Tcl instead, but I am biased. :)
I occasionally have to write Tcl. No thanks :)
I wrote it during four years as main language (1999 - 2003), alongside C.
Yes, I was going to write this comment a few hours ago but never got around to it. Working on other peoples Lua can be very painful.
Even the fact that people really want to write object oriented code, but every project rolls its own class system is a problem.
When I write lua is just tables of data and functions. I try to keep it as simple as possible.
I've been enjoying writing games for the Playdate, and in Love2d.
if Lua had zero-based indexing it won't be Lua. also removing '.length - 1' globally will reduce gas emissions by 1% worldwide (my guess)
I held a similar opinion several years ago. The main thing is that lua has less magic than js largely because it's been allowed to break compatibility.
My main example is self in lua which is just the first argument of a function with some syntactic sugar vs this in javascript which especially before 'bind' often tripped people up. The coercion rules are also simpler largely by virtue of making 0 true.
Lua has goto instead of continue.
Lua's metatables also cover a fair bit more ground than JS. For example, indexing, math operations (including bitwise), comparisons and equality can be overriden.
Some of thr language features like operators can be overloaded using method tables, much like python's double underscore methods. I do think that the inability to do this in JS is what held it back from becoming a data processing powerhouse. But it perhaps can be made to run fast precisely because of the lack of such overrides (otherwise we'll be running python at JS speeds). That said, LuaJIT seems like a good compromise and might be best of both worlds.
I love JS and Ruby, and I spent a bit of time maintaining a web app in Lua.
There are parts about the language I really enjoy. I like dynamic languages; I don't run into type issues so often it bothers me, but the tooling around Lua still leaves me wishing for me.
Part of this is really my own ability, or lack thereof, but I feel slow and constrained in lua. I have a hard time debugging, but I don't feel these things in other languages.
I like lua better because its minimalist. Javascript feels like a kitchen sink language.
Otoh, missing ++, +=, ..= operators really bothers me.
But that's just personal taste, not objective by any means.
Yeah. I think if I were to pick one reason, it would be Lua is "minimalist". But, like I wrote, maybe, it too, would have just had lots of stuff added onto it if it had been in the browser. Hard to say.
I love ++, but you know what? I was shocked when a co-worker pointed out that it is frowned upon to use ++ in javascript. It some big companies, their linter settings mark ++ as something that should be changed.
https://eslint.org/docs/latest/rules/no-plusplus
Lol, i suppose there is no accounting for taste, but the justification for that rule is really something.
Appearently its really confusing if you put a bunch of newlines between the ++ operator and its ophand. No kidding.
The origin of this rule was that Douglas Crockford doesn't like ++.
From JavaScript: The Good Parts (May 2008):
> The ++ (increment) and -- (decrement) operators have been known to contribute to bad code by encouraging excessive trickiness. They are second only to faulty architecture in enabling viruses and other security menaces. The JSLint option plusplus allows the use of these operators.
Needless to say, I'm with you and hackthemack on this.
Nil is better than how undefined works. It's not just as bad and then more bad on top.
There’s also a Javascript implementation of Lua which allows one to run Lua in a browser:
https://github.com/fengari-lua/fengari-web
I don’t know if this can access the DOM in Lua, but considering that Fengari is in Javascript, adding a _DOM global variable should not be too hard (if it hasn’t already been done).
No need for _DOM.
Fengari lets you access the DOM. Its pretty cool.
IIRC Brendan Eich has talked about this - if he’d adopted Lua in 1995 instead of creating JavaScript, it wouldn’t have been Lua 5.x but Lua 2.x.
Lua has improved substantially from version to version because it’s been able to break compatibility. That wouldn’t be possible in the browser, so today we’d still be stuck using (an augmented version of) the outdated Lua 2.x.
Yes, he did:
https://web.archive.org/web/20191024193930/https://twitter.c...
“Lua in 1995 was very different, no coros e.g., and no one would be happy if it flash-froze and then slow-forked on a different path from Lua's. See https://news.ycombinator.com/item?id=1905155 and yes, wasm is the right long-term plan. In 1995 it was supposed to be Java, but that didn't pan out!”
Fengari has DOM interop.
http://fengari.io/
It is a Lua reimplementation in JS.
There is also nelua (https://nelua.io/) which can use WASM to compile allow its usage in the browser: https://github.com/edubart/nelua-game2048/
It is interesting that so many of the largest languages were developed in a couple year time frame in the early-mid 90s. Python, Javascript, Java, Lua, R. All of these were developed 91-95 and make a bulk of development today.
It is an interesting observation!
I'm sure there are a few unrelated factors going on to influence that plus some inadvertent cherry-picking. But I do think there is a thing to the observation too.
If I had to make a guess, I'd point to a combination of:
1. This was right around the time that computers were fast enough to afford the runtime cost of garbage collection while still delivering usable apps. The GC languages before the 90s (Lisp, Scheme, Smalltalk, Self) had reputations for abysmal performance and have largely (but not entirely) died out.
2. This was also the boom of the Internet and web which shifted a lot of computation to server-side after the PC boom had shifted it client-side. That also enabled somewhat slower languages with GC to thrive because first-mover advantage was more valuable than using your servers efficiently. You could just throw more hardware at it.
3. That boom also created a huge influx of new developers who picked up whatever language happened to be hot at the time. And once they had, they tended to stick with it. I can't find a good graph of the number of software engineers over time but I'd bet that there's a noticeable uptick around the dotcom boom.
After mulling it over a bit, plus seeing a few other response (Php, Ruby, etc) I think the internet is the reason.
C, C++, Ada, Basic, Fortran. Those were written assuming they were going to be writing systems code and numerical code. There is then some inertia keeping the most successful ones of from those periods going in their spaces.
Then these new languages with new ideas (GCs, Dynamic Typing, etc) they came out and became successful in the newer Web and Application spaces. Now why they won out and not LISP or SmallTalk or what have you, I am not sure. But my hypothesis is that the web is a big part of it.
Languages developed in that time matured just as good binary package managers started popping up, is my pet theory. Before that, getting a development environment for a new language was serious work, and people did things like stick to Perl since it happened to be installed already.
Not true.
I was programming in the 90s when these languages emerged. Developments environments were emacs, vi, Brief, Borland IDE, etc. There were a few other IDEs available, but about $200 per seat.
All the scripting languages mentioned didn't come as default in Unix or Windows. You had to download from their own websites.
It was mostly Visual Basic, C, COBOL that were popular.
I think that's what I mean. After the time you talk about ('90s), these languages matured, and they happened to mature around the same time binary package managers became a thing, i.e. in the early-to-mid '00s.
There was also ELK Scheme, the Extension Language Kit, a Scheme interpreter designed to be used as an extension language for other applications.
https://www.usenix.org/legacy/publications/compsystems/1994/...
>Elk, the Extension Language Kit, is a Scheme implementation that is intended to be used as a general, reusable extension language subsystem for integration into existing and future applications. Applications can define their own Scheme data types and primitives, providing for a tightly-knit integration of the C/C++ parts of the application with Scheme code. Library interfaces, for example to the UNIX operating system and to various X window system libraries, show the effectiveness of this approach. Several features of Elk such as dynamic loading of object files and freezing of fully customized applications into executables (implemented for those UNIX environments where it was feasible) increase its usability as the backbone of a complex application. Elk has been used in this way for seven years within a locally-developed ODA-based multimedia document editor; it has been used in numerous other projects after it could be made freely available five years ago.
Also Gnu Guile:
https://en.wikipedia.org/wiki/GNU_Guile
>GNU Ubiquitous Intelligent Language for Extensions[3] (GNU Guile) is the preferred extension language system for the GNU Project[4] and features an implementation of the programming language Scheme. Its first version was released in 1993.[1] In addition to large parts of Scheme standards, Guile Scheme includes modularized extensions for many different programming tasks.[5][6]
Also Winterp, which used XLisp:
https://dl.acm.org/doi/10.1145/121994.121998
>Winterp is an interactive, language-based user-interface and application-construction environment enabling rapid prototyping of applications with graphical user interfaces based on the OSF/Motif UI Toolkit. Winterp also serves as a customization environment for delivered applications by providing a real programming language as an extension language. Many existing user-interface languages only have the expressive power to describe static layout of user interface forms; by using a high-level language for extensions and prototyping, Winterp also handles the dynamic aspects of UI presentation, e.g. the use of direct manipulation, browsers, and dialog. Winterp makes rapid prototyping possible because its language is based on an interpreter, thereby enabling interactive construction of application functionality and giving immediate feedback on incremental changes. Winterp's language is based on David Betz's public domain Xlisp interpreter which features a subset of Common Lisp's functionality. The language is extensible, permitting new Lisp primitives to be added in the C language and allowing hybrid implementations constructed from interpreted Lisp and compiled C. Hybrid implementation gives Winterp-based applications the successful extension and rapid-prototyping capabilities of Lisp-based environments, while delivering the multiprocessing performance of C applications running on personal Unix workstations.
And TCL/Tk of course!
https://www.tcl-lang.org/
And on the commercial side, there was Visix Galaxy, which was extensible in PostScript, inspired by NeWS:
https://www.ambiencia.com/products.php
https://0-hr.com/Wolfe/Programming/Visix.htm
https://groups.google.com/g/comp.lang.java.programmer/c/LPkz...
https://donhopkins.com/home/interval/pluggers/galaxy.html
https://wiki.c2.com/?SpringsAndStruts
>The Visix Galaxy project was a ridiculously overpriced and overfeatured portable GraphicalUserInterface. You could do things like swivel an entire panel full of their custom widgets 32 degrees clockwise, and it would render all its text at this new angle without jaggies. The company went out of business after gaining only a handful of customers. For USD$ 10,000 a seat they sure didn't see the OpenSource movement coming. Their last attempt before going under was (guess what?) a Java IDE.
Galaxy competed with Neuron Data Systems in the "cross platform gui framework" space (which got steamrolled permanently by the web, and for a window of time by Java):
https://donhopkins.com/home/interval/pluggers/neuron.html
Here is a great overview of User Interface Software and Tools by Brad Myers:
https://www.cs.cmu.edu/~bam/uicourse/2001spring/lecture05too...
https://www.cs.cmu.edu/~bam/toolnames/
https://docs.google.com/document/d/1hQbMwK_iyjX-wpu_Xw_H-3zL...
Java is from 1996, C++ only got standardized in 1998, C in 1990 (technically the standard is from 1989, but there was a short ratification in 1990), and Delphi is from 1995 (not that big a player nowadays, but plenty of its influence lives on in C#, TypeScript, and Kotlin).
It goes to show how much investment is required for a programming language to actually take off at scale.
However, in a couple of years we will be asking the computers to perform tasks for us, and the actual compiler frontend will be irrelevant to the AI runtime.
Don't forget Ruby in 1995!
PHP too.
I think that's an illusion.
The language of R is S, which originated at Bell Labs in 01976. Python began development in 01989, although Guido didn't release it until 01991. And the top 20 on https://www.tiobe.com/tiobe-index/ are Python, C (01972?), C++ (01982?), Java, C# (01999? though arguably it's just a dialect of Java), JS, Visual Basic (first released 01991, within your window), Golang (02007), Delphi (under this name in 01995 but a dialect of Object Pascal from 01986, in turn a dialect of Pascal, from 01970), SQL (01973), Fortran (01957), Perl (01987), R, PHP (01995, within your window!), assembly (01947), Rust (02006), MATLAB/Octave (01984), Scratch (! 02003), Ada (01978?), and Kotlin (02011).
By decade, that's one language from the 40s, one language from the 50s, no languages from the 60s, 5 languages from the 70s, 5 languages from the 80s, 4 languages from the 90s, 3 languages from 0200x, one language from the 02010s, and no languages from the 02020s.
Lua is #33 on TIOBE's list, but given its prevalence in Roblox (as Luau), WoW, and other games, I suspect it should be much higher.
For some reason, CUDA (a dialect of C++) and shader languages like GLSL don't show up in the list at all.
— ⁂ —
I think most of what's going on here is that it takes a new language a long time to get good, and it takes a new good language a long time to get popular. Perl, Python, Java, PHP, and JS became popular because of the Web; https://philip.greenspun.com/panda/server-programming explains why Perl, Python, and PHP did, and of course Java and JS became popular because they were the only languages you could make interactive web pages in:
> You would think that picking a Web site development language would be trivial. Obviously the best languages are safe and incorporate powerful object systems. So let's do everything in Common Lisp or Java. Common Lisp can run interpreted as well as compiled, which makes it a more efficient language for developers. So Common Lisp should be the obvious winner of the Web server language wars. Yet nobody uses Common Lisp for server-side scripting. Is that because Java-the-hype-king has crushed it? No. In fact, to a first approximation, nobody uses Java for server-side scripting. Almost everyone is using simple interpreted languages such as Visual Basic, PHP, Perl, or Tcl.
> How could a lame string-oriented scripting language possibly compete in power with systems programming languages? Well, guess what? The only data type that you can write to a Web browser is a string. And all the information from the relational database management system on which you are relying comes back to the Web server program as strings. So maybe it doesn't matter whether your scripting language has an enfeebled type system.
Why do you write your years with a leading zero?
Some people think that writing years as 2025 is wrong because it will lead to problems in the year 9999 (the y10k bug? I'm not sure if that's what they call it), so they decided to introduce a leading zero, as if that would solve something rather than just postpone the problem to 99999.
So they are assuming that:
- this comment will still be around in 8000 years
- we will still be using the same calendar system in 8000 years
- people 8000 years in the future will leave off the leading 1 of years for some reason, and will use a leading 0 to disambiguate dates from the previous 10000 year period.
I would say it is a symbolic reminder to care about the long term consequences of our actions. In the same way we have holidays to remind us about the environment or mortality.
Somehow both incredibly optimistic and also unbelievably resigned at the same time
That's silly. The y2k bug was because the year was written as 65, instead of the full year being 1965, so information was lost. Writing 2025 has no missing information.
So you can instantly recognize at a glance that it's Kragen's post! ;)
It's a Long Now Foundation thing: slower, deeper, longer. Y10K compliance.
https://longnow.org/ideas/long-now-years-five-digit-dates-an...
"The present moment used to be the unimaginable future." -Steward Brand
"How can we invest in a future we know is structurally incapable of keeping faith with its past? The digital industries must shift from being the main source of society’s ever-shortening attention span to becoming a reliable guarantor of long-term perspective. We’ll know that shift has happened when programmers begin to anticipate the Year 10,000 Problem, and assign five digits instead of four to year dates. 01998 they’ll write, at first frivolously, then seriously." -Steward Brand
10,000 Year Clock:
https://longnow.org/clock/
I wonder if somebody offers a therapy for that.
I do agree with your point, but I also think that there is a lot of inertia in the sector (rightfully so!) and it is very difficult for languages to become established if they don't come with a "unique selling point" of sorts, which to me explains why newly popular languages have become rarer.
That selling point, for Lua, is to me the super easy integration via the C API (making existing compiled applications scriptable), thanks to its uncomplicated build (dependency-free, simple), the straightforward C API, and the ease of exposing C "modules" to Lua.
On a sidenote:
Don't you think that Y10k-safe dates are somewhat inconsistent with referencing previous decades directly? Those dates are also obnoxious to parse for humans (myself, at least).
>C# (01999? though arguably it's just a dialect of Java)
That's like saying Java is a dialect of C++. Java was specifically designed as a "fuck you" to C++, and C# was specifically designed as a "fuck you" to Java.
While at a political level that's reasonable, at both the semantic and the syntactic level, the first version of C# was very close to Java, much closer than the first version of Java was to C++. https://learn.microsoft.com/en-us/dotnet/csharp/whats-new/cs... is a very vague overview.
More like a better Objective-C, and with a syntax that was appealing to C++ developers.
https://cs.gmu.edu/~sean/stuff/java-objc.html
.NET was being designed around J++, Microsoft's Java extensions; the Cool research language only became C# and took over J++'s role in .NET due to Sun's lawsuit.
The lawsuit is more than well known, and the background on .NET's planned use of J++ is in the papers published by Don Syme of F# fame regarding the history of .NET and F# (HOPL).
Regarding C# and Java part of your comment, I think you might want to take a look at the following Wikipedia entries:
- Microsoft Java Virtual Machine: https://en.wikipedia.org/wiki/Microsoft_Java_Virtual_Machine
- Visual J++: https://en.wikipedia.org/wiki/Visual_J%2B%2B
I've known and worked with James Gosling for years before Java (Live Oak), on his earlier projects, Emacs at UniPress and NeWS at Sun, and fought alongside him against Sun management, trying to make NeWS free in 1990 (and I left Sun because they broke the promises they made us and spilled a lot of blood), so I didn't need to learn about Java's history from Wikipedia.
James's email that convinced me to go work with him at Sun on NeWS in 1990:
https://news.ycombinator.com/item?id=22457490
James' original 1985 paper on SunDew (later called NeWS):
https://www.chilton-computing.org.uk/inf/literature/books/wm...
David Rosenthal on NeWS -vs- X11 in 2024:
https://www.theregister.com/2024/07/10/dshr_on_news_vs_x/
James Gosling on how he'd do it over again in 2002:
https://web.archive.org/web/20240126041327/https://hack.org/...
Me on the X-Windows Disaster, comparing X11 and NeWS in the 1994 Unix Haters Handbook:
https://donhopkins.medium.com/the-x-windows-disaster-128d398...
Here's a Stanford talk James Gosling gave about Java that I attended in 1995, where he talks about C++, his original tape copy program that turned into a satellite ground control system, how he holds the world record for writing the largest number of cheesy little extension languages to go, his implementation of Emacs sold by UniPress (which RMS calls "Evil Software Hoarder Emacs"), and his design and implementation of NeWS (formerly SunDew), a PostScript-based network-extensible window system.
James Gosling - Sun Microsystems - Bringing Behavior to the Internet - 1995-12-1:
https://www.youtube.com/watch?v=dgrNeyuwA8k
>Video of James Gosling's historic talk about Java, "Bringing Behavior to the Internet", presented to Terry Winograd's user interface class at Stanford University, December 1, 1995.
In that talk I asked him a couple questions about security and the "optical illusion attack" that he hedged on (44:53, 1:00:35). (The optical illusion attack is when the attacker simply draws a picture of a "secure" pop up dialog from your bank asking for your password.)
He mentioned offhand how a lot of the command and control systems for Operation Desert Storm were written in PostScript. That was his NeWS dialect of PostScript, in a system called "LGATE" written primarily by Josh Siegel at LANL, who later came to work at Sun in 1990 and rewrote the NeWS PostScript interpreter himself, then went on to write an X11 window manager in PostScript, again proving James's point that people always did a lot more with his cheesy little extension languages than he ever expected (which also held true with Java).
Josh's work on simulating Desert Storm and WWIII with NeWS at LANL:
https://news.ycombinator.com/item?id=44540509
Some of Terry Winograd's other guest speakers:
https://news.ycombinator.com/item?id=39252103
I also saw Bill Joy's much earlier talk at the 1986 Sun Users Group in Washington DC, where he announced a hypothetical language he wanted to build called "C++++-=", and that he talked about in subsequent presentations.
I think that was the same talk when Bill said "You can't prove anything about a program written in C or FORTRAN. It's really just Peek and Poke with some syntactic sugar". More Bill Joy quotes:
https://www.donhopkins.com/home/catalog/unix-haters/slowlari...
James eventually realized that concept as Java, showing that the kernel inspiration of writing a "fuck you to C++" language existed long before James invented "Live Oak", even soon after C++ was invented. But "Java" was a much better name than "Live Oak" or "C++++-=" fortunately -- thanks to Kim Polese -- though not as succinct and musically inspired as "C#"!
https://en.wikipedia.org/wiki/Bill_Joy#Joy's_law
https://news.ycombinator.com/item?id=30113944
Bill Joy’s Law: 2^(Year-1984) Million Instructions per Second
https://donhopkins.medium.com/bill-joys-law-2-year-1984-mill...
>The peak computer speed doubles each year and thus is given by a simple function of time. Specifically, S = 2^(Year-1984), in which S is the peak computer speed attained during each year, expressed in MIPS. -Wikipedia, Joy’s law (computing)
>“C++++-= is the new language that is a little more than C++ and a lot less.” -Bill Joy
>In this talk from 1991, Bill Joy predicts a new hypothetical language that he calls “C++++-=”, which adds some things to C++, and takes away some other things.
>“Java is C++ without the guns, knives, and clubs.” -James Gosling
I think alasr meant to suggest that you might learn more about the history of C# by reading Wikipedia, not about the history of Java.
[flagged]
[flagged]
True
> work has begun on Lua 5.5
A beta version is now available:
https://www.lua.org/work/
The main change from 5.4 seems to be the (optional?) removal of global-by-default, instead requiring declarations for global variables.
As it turns out, it’s possible with Lua going back to 5.1 to not allow undeclared global variables:
https://www.lua.org/pil/14.2.html
That code looks like this in Lua 5.1:
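The code itself is elided here; a sketch along the lines of what that PIL chapter shows (reproduced from memory, so check the link for the authoritative version):

    -- Keep a table of declared names and trap reads/writes of anything else.
    local declaredNames = {}

    setmetatable(_G, {
      __newindex = function (t, n, v)
        if not declaredNames[n] then
          -- assignments made from the main chunk (or from C) count as declarations
          local w = debug.getinfo(2, "S").what
          if w ~= "main" and w ~= "C" then
            error("attempt to write to undeclared variable " .. n, 2)
          end
          declaredNames[n] = true
        end
        rawset(t, n, v)
      end,

      __index = function (_, n)
        if not declaredNames[n] then
          error("attempt to read undeclared variable " .. n, 2)
        end
        return nil
      end,
    })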
Naturally, the main catch is that this only detects the violation at run time. It also won't stop you from accidentally overwriting a global variable that already exists.
> It also won't stop you from accidentally overwriting a global variable that already exists.
That's also solvable using metatables.
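For example (a minimal sketch of my own, not the parent's code, assuming Lua 5.2+): load chunks with an empty proxy table as their environment, forward reads to a snapshot of the globals, and reject writes that would shadow an existing name.

    -- Snapshot the current globals, then build an empty proxy whose metatable
    -- forwards reads to the snapshot and rejects writes to names that already exist.
    local saved = {}
    for k, v in pairs(_G) do saved[k] = v end

    local proxy = setmetatable({}, {
      __index = saved,
      __newindex = function (t, k, v)
        if saved[k] ~= nil then
          error("attempt to overwrite existing global '" .. tostring(k) .. "'", 2)
        end
        rawset(t, k, v)
      end,
    })

    -- Lua 5.2+: run a chunk with the proxy as its _ENV.
    local chunk = load("print = 42", "guarded", "t", proxy)
    print(pcall(chunk))  --> false, plus an error about overwriting 'print'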
Do the declarations in 5.5 prevent overwriting? That sounds tricky to define and implement.
Lua is the SQLite of programming languages, an absolute blast.
While I enjoy Lua, clean, elegant, and entirely too reasonable, Tcl is undoubtedly the SQLite of programming languages.
https://www.tcl-lang.org/community/tcl2017/assets/talk93/Pap...
It's because Tcl, like SQLite, operates on a peculiar metaphysical principle: everything is a string until proven otherwise, and even then, it's probably still a string.
Also, D. Richard Happ, who we owe thanks for SQLite, was and perhaps still sits on the TCL Board (I may be wrong about that, but Happ holds significance in the TCL community).
In my mind:
Tcl is the quietly supportive roommate who keeps making coffee and feeding LISP-like functionality until the world finally notices its genius.
Lua sits across the table, sipping espresso with a faintly amused expression, wondering how everyone got so emotionally entangled with their configuration files.
Lua is one of the easiest configuration file formats I've had the pleasure of working with. Readable. Has comments. Variables. Conditionals.
Everyone (including me): "oh no, no, you don't want a full Turing complete language in your configuration file format"
Also Everyone: generating their configuration files with every bespoke templating language dreamed of by gods and men, with other Turing complete languages.
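For illustration, a sketch of what that can look like (the setting names here are made up, not from the comment):

    -- settings.lua: a config file that is just a Lua script, so comments,
    -- variables, and conditionals all come for free.
    threads   = 4
    log_level = "info"

    if os.getenv("CI") then   -- tone things down on the build server
      threads   = 1
      log_level = "warn"
    end

    cache_dir = "/var/tmp/myapp"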
It's a security issue. Configs are user interfaces. Devs generating configs is irrelevant.
Indeed - it would depend greatly on one's workflow and threat model.
You could solve this with a capabilities permissions system. That way the config files can be written in the same language but have configured permissions that are different from the rest of the programming language. So you could restrict the config files from resources like threads, evaling source, making network requests and whatnot. Come to think of it, you could even probably section off parts of the language behind capabilities such that the config files could be configured to be a non-Turing-complete subset of the language.
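A minimal sketch of that idea (mine, not the parent's design), using load with an explicit environment in Lua 5.2+ so the config chunk only sees the "capabilities" it is handed:

    -- The chunk cannot see io, os, require, or anything else not granted below.
    local function load_config(source, name)
      local env = { math = math, string = string }     -- granted capabilities
      local chunk, err = load(source, name, "t", env)  -- "t": text only, no bytecode
      if not chunk then error(err) end
      chunk()
      return env
    end

    local cfg = load_config([[
      window_width  = 800
      window_height = math.floor(window_width * 9 / 16)
    ]], "settings")

    print(cfg.window_width, cfg.window_height)  --> 800  450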
Lua started as a config language.
*Hipp
Didn't the string stuff get improved in the last few years to be a lot more performant?
TCL 9 brought some welcome string improvements, and things run faster overall. But in my case, it's hard to say how well that's actually played out, partly because I haven't done the work to find out. My TCL scripts and apps work well enough to allow me to be lazy about them.
Performance is up, but so is my inertia. So while TCL 9 could be transformative, for now it remains a white paper I've skimmed, not a revolution I've implemented.
I think TCL does an opaque thing: everything "is" a string, but if you don't use it as a string, it's actually stored in some optimized format, then converted back to a string on demand.
I still prefer Lua personally. Its type system is easy for me to understand.
Lua is the glue when sh/bash doesn't suffice.
Lua is simple and elegant, and I much prefer it to Tcl.
Lua is in games and in LuaTeX, and when you have the choice of embedding a LISP, a FORTH or Lua in a larger application, it is often the most maintainable, runtime-efficient and low-memory footprint option of all.
> Lua is the glue when sh/bash doesn't suffice.
I enjoy Lua and use it professionally, but when bash (and AWK) don't suffice, the glue is Perl. Because it has pipes which you can use to connect the output of one command to the input of another, or to a file.
People don't normally embed FORTH in a larger application.
If you embed Lua you also get Fennel (almost LISP) support for free.
What would be the SQLite's equivalent to indexing starting from 1, not 0? Off the top of my head I can't think of anything that would go so much against the grain.
For me it's case insensitive LIKE.
column types are more like guidelines than rules
DuckDB is the mother of them all. It is waaaay more capable than SQLite, and embeds SQLite for those who need 100% compatibility for some workloads.
nice LoL
I don't understand why Python is so popular when there's Lua. It's just so much better. Not as good as Rebol but still much better than Python.
Python has a much larger and more mature ecosystem. Lua barely has a functioning package manager.
It's still early in development, but the new package manager Lux looks promising: https://github.com/lumen-oss/lux
Don't forget that it's much more productive to write a text-parsing script in Python than in Lua, due to its huge, feature-rich syntax.
Lua is great when you need an embedded language.
That's it. It's not great on its own, in my opinion.
It becomes really gnarly if you have a bigger Lua codebase.
Rebol is the cleanest/greatest language I've read code in, but its VM is the slowest VM I've ever written code for. Mind you, I only did the first 10 Project Euler exercises, but the only thing that has it beat is a shell script that forks to dc/bc for each math expression.
My only context for using Lua is my Neovim configuration. Does anyone know of any good books or tutorials that build something more advanced using only Lua? Anything of note to consider/read/watch?
Shameless plug time: I have written a public domain book which looks at using Lua (Lua 5.1, but the code also works in newer versions of Lua as well as Luau and LuaJIT) as a text parsing engine. The book assumes familiarity with other common *NIX scripting languages (such as AWK, Perl, or Python) and goes over in detail the pain points for people coming from a *NIX scripting background:
https://maradns.samiam.org/lunacy/SamDiscussesLunacy.pdf
Source files (.odt file, fonts used by book):
https://github.com/samboy/lunacy/tree/master/doc
Hope this helps!
Thanks for the recommendation and for writing the book! I'm reading the introduction; it doesn't answer why you chose to fork Lua and create Lunacy. What were you trying to solve with Lunacy that Lua couldn't do?
The TOC looks great, I will read this soon. Need to finish "Debugging CSS" first (another good book IMO).
The main reason I made Lunacy was to have a standard compile of Lua 5.1, since it's possible to make a Lua 5.1 compile with, say, 32-bit floats, or one which only supports integers but not floats.
Lunacy also has a few built in libraries which are not included with Lua 5.1, such as binary bitwise operations (and/or/xor). It also fixes some security issues with Lua 5.1 (better random number generator, hash compression algorithm which is protected from hash flooding attacks).
In addition, I have made a tiny Windows32 binary of Lunacy.
Don’t worry about the Lunacy changes; all of the examples in the book work with bog standard Lua 5.1 with a bit32 library (bit32 is common enough most OSes with a Lua 5.1 package also have a bit32 package for Lua 5.1).
Don't have tutorials or books, but I've had a ton of fun using Lua with LOVE2D [0] for gamedev, and also Redbean [1] for building super small portable web applications. Earlier this year, I built a mini CMS [2] inspired by rwtxt with Redbean.
[0] https://love2d.org/
[1] https://redbean.dev
[2] https://github.com/kevinfiol/beancms
Haven't heard of LOVE2D but have heard of pico-8. I've avoided most game dev tutorials because they seem overly focused on beginners, which is fine, but I want to find more advanced materials that assume the reader knows some basics.
Maybe I should reconsider and dive more into game dev.
This series of tutorials might be of interest: https://github.com/a327ex/blog/issues/30
> It's aimed at programmers who have some experience but are just starting out with game development, or game developers who already have some experience with other languages or frameworks but want to figure out Lua or LÖVE better.
Also on the topic of game engines with lua scripting, the wonderful Defold always deserves a mention https://defold.com/
Definitely include Roberto's Programming in Lua book in your list, especially if you'd like to script Lua together with C. The book has a good primer on the Lua-C API in its latter half.
I always wrote Lua off, scoffing at the 1-based indexing, until I was "forced" to learn it thanks to Neovim. What a delightful little language it is. I do wish I could do certain things less verbosely (lambdas would be nice) -- but then again, I defeat myself by suggesting it, because not having all the features makes Lua so approachable.
I used Lua professionally. I prefer the 1-indexing... it just feels more natural. For some reason, the C apologists here will scream that 0-based is the only way to go (which it is not; it is just a historical artifact). Languages like Ada allowed you to use either 0 or 1 (or any arbitrary starting index).
Same here. In fact, something I wish the Neovim team would do is create a book where popular plugin authors write tutorials that recreate the basic functionality of their plugins.
Seems like a no-brainer that would help bring in more revenue too; it'd also be an "evergreen" book, as new authors can contribute over time.
I can't be the only one that would immediately buy a copy. :D
I'm actually trying to work on a video series to do just this. I've made my own rudimentary plugins reproducing several popular ones, and would like to walk through how I made: a) file-tree b) picker/fzf replacement c) hop/leap replacement d) surround plugin e) code-formatter f) hydra (sub-modes) g) many "UI" (interactive) buffers, etc.
None of these are published because the popular ones are better and provide more functionality, but I want to share what I believe is more valuable: what I learned while writing them.
That sounds great! Do you have a youtube channel or something to follow when you release it?
Yep, though I'm still trying to hit my stride recording videos. I don't release regularly because of lots of amazing $life things.
https://www.youtube.com/@nocturing
If you want a sneak peek of what I want to walk through, check this repo (see the examples/ folder): https://github.com/jrop/u.nvim
Lua has lambdas. They too suffer from verbosity, of course, but they're there.
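For anyone unfamiliar, a quick sketch of the standard form (my example, not the parent's code):

    -- An anonymous function is a full closure, just a bit wordy for one-liners.
    local points = { { x = 3 }, { x = 1 }, { x = 2 } }
    table.sort(points, function (a, b) return a.x < b.x end)
    print(points[1].x, points[2].x, points[3].x)  --> 1  2  3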
There are patches for this so the above can be expressed with something like this:
http://lua-users.org/files/wiki_insecure/power_patches/5.4/l...
And for Lua 5.1:
http://lua-users.org/files/wiki_insecure/power_patches/5.1/l...
(I personally don’t use patches like this because “Lua 5.1” is something pretty standardized with a bunch of different implementations; e.g. I wrote my Lua book with a C# developer who was using the moonsharp Lua implementation)
That's what I meant and didn't communicate well. I'm wishing for short-form syntax of lambdas, to be clear.
Take a look at Roblox: their market cap is almost $100B, they developed Luau, and their game engine runs on it.
- https://luau.org/
- https://luau.org/why
I absolutely love Lua and I especially love when others discover Lua and are about to embark on the amazing journey.
Here, you're gonna need this:
https://www.lua.org/gems/
Wish there were a newer LuaJIT to leverage the new Lua features, but then maybe those new features are not really that critical.
LuaJIT is in somewhat active development, with 40 commits so far this year, although these are mostly bug fixes (some for bugs introduced by LLVM). The main new feature, if you can call it that, is "support for Apple hardened runtime."
LuaJIT is also stuck at Lua 5.1 (by choice) while the latest is Lua 5.4, with 5.5 on the way.
You say that as if Lua 5.3 and 5.4 were better than Lua 5.2 (which LuaJIT has support for most of the new features of) or 5.1, rather than merely newer. But programming languages don't decay like your teeth.
They don't decay but it results in a split like Python 2 vs. 3 where some Lua code is not valid in applications using LuaJIT.
That happens with all Lua applications, because Lua has never aimed for backward compatibility from one version to the next, so applications basically never upgrade to a new version of Lua.
New applications using LuaJIT will continue to be on Lua 5.1. And applications that do upgrade their dependencies but use LuaJIT are going to be stuck on 5.1 maybe forever, too.
Yes, and there's nothing wrong with that. It doesn't result in the same degree of fragmentation in Lua as it did in the Python 2/3 split, because it's both socially accepted and usually technically easy to write code that works in both Lua 5.1 and Lua 5.4 and everything in between.
Except they are? Lua 5.3's native bitwise operators are a big improvement over the bit32 built-ins (bit32.band, bit32.bor, ...); the addition of an integer type makes the language more suitable for embedded systems that don't have an FPU; and Lua 5.4 added the const attribute for local variables.
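For illustration (my own sketch, not from the comment), extracting the low byte of a value both ways:

    -- In Lua 5.2 (or with the separate bit32 library) masking needs a function call:
    --   local low = bit32.band(flags, 0xFF)
    -- In Lua 5.3+ the same thing is a native operator on integers:
    local flags = 0x1234
    local low = flags & 0xFF
    print(low)  --> 52 (0x34)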
Those are tradeoffs, and I'm not convinced they're good ones. Suitability for processors that are so small they don't have an FPU isn't relevant to LuaJIT in any case, but you've been able to compile Lua with integer numbers since 1.0.
I feel like the existence of LuaJIT made it seem like standard Lua was slow, but that's far from the case.
I spent a few nights trying to implement a Lua interpreter myself, and it was still like 10x slower than PUC Lua, even before adding a GC. I'm not sure how they do it; it looks like regular C code.
LuaJIT is amazing. I find it insane there are no Schemes that are able to match it as a tiny, embeddable scripting language with a JITer. GNU Guile is an absolute gargantuan monster in comparison.
But you can use Fennel, which is a bit Scheme-like if you squint, and it transpiles to very nice Lua code in my experience.
Agree, especially because it'd be nice for projects that use LuaJIT if you could swap the versions as needed.
When comparing speed I use simple tests like a loop printing an incremented line number, or reading from stdin and printing to stdout. These simple tests are useful for me because, when combined with pattern matching or regular expressions, simple I/O tasks like these are actually what I use a "memory safe" language for.
dino is slightly faster than lua (not luajit)
but spitbol is actually faster than lua, dino and luajit
ngn k is slightly faster than spitbol but lacks built-in pattern matching or RE
FWIW, I do not use a terminal emulator. I only use text mode. No graphics layer. No "desktop".
> These simple tests are useful for me because, when combined with pattern matching or regular expressions, small, simple I/O tasks like these are actually what I use a "memory safe" language for
Are you sure you aren't being bottlenecked by your terminal?
For me,
statically-linked 64-bit spitbol is only 175k
static-pie luajit 2.1 is 667k
static-pie lua 5.1.5 is 259k
static-pie ngn k is 271k
I've been saying it for years: Lua needs its Ruby on Rails moment
For many, Lua is primarily known as the Roblox language. Pretty impressive that it's the language of use in a game (/set of games) with 380 million monthly active players - currently the most popular in the world.
Before then it was the World of Warcraft UI mod language. Spawned a pretty sizable ecosystem around it, though I imagine Roblox puts it to shame.
Yes, with Lapis[1].
[1] https://leafo.net/lapis/
Lapis is very cool, but I really struggle with a lack of examples and pre-built solutions for some things.
Leafo's itch.io is built with it, and I maintain snap.berkeley.edu. A great tool, but I'm also many times more productive in Rails (it's an unfair comparison, to be sure!). Openresty + Lapis is quite performant and low overhead, which is great.
Looks promising. Thanks for sharing!
Hell no, keep it small as is: no influence from big corporations and no enshittification (or having a "DHH problem" in the Lua community).
I see your point. I've just craved a rails-like experience in Lua for a while and don't believe there's anything out there yet built on Lua that can match with the big boys (rails, .NET, etc.)
That's probably why I use it: no corporations, and gamers use it, so it doesn't get enshittified like Java/C#/C++ etc. No committees either.
Anybody know if there's a Lua DAP (debugger) server in development by anyone?
Neovim lua has https://github.com/jbyuki/one-small-step-for-vimkind but it doesn't seem there's any DAP server for regular lua.
That's it - I'm building my next startup in Lua!
A useful design pattern is to write highly efficient "engine" code in C/C++ and then tie it together to highly customizable application code with an embedded scripting language.
Lua is great for this, and while you could use a LISP (Emacs, AutoCad) or a FORTH (any real-life example not from the radio telescope domain?) or Tcl/Tk (https://wiki.tcl-lang.org/page/Who+Uses+Tcl), Lua is small (as in number of concepts to understand), easy to read and understand, and compact (as in tiny codebase to integrate, few dependencies).
So many gaming startups have put their money on Lua.
The Lua-C API is also really consistent and straightforward. Bindings can be generated mechanically, of course, but it's really easy to embed by hand, and the documentation is superb.
And LLMs should be proficient in Lua and the bindings by now. That will help with velocity.
I believe Cloudflare has quite a bit of Lua in some of their products (there are some very good Nginx Lua integrations and variants like OpenResty).
Here’s my bit of public domain code for iterating through tables in Lua so that the elements are sorted. This routine works like the pairs() function included with Lua:
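The routine itself is elided here; a sketch of the general idea (mine, not necessarily the commenter's exact code):

    -- Iterate a table's keys in sorted order, like pairs() but deterministic.
    local function sortedPairs(t, sortFunc)
      local keys = {}
      for k in pairs(t) do keys[#keys + 1] = k end
      table.sort(keys, sortFunc)
      local i = 0
      return function ()
        i = i + 1
        local k = keys[i]
        if k ~= nil then return k, t[k] end
      end
    end

    -- Example usage:
    local scores = { carol = 3, alice = 1, bob = 2 }
    for name, score in sortedPairs(scores) do
      print(name, score)   --> alice 1 / bob 2 / carol 3
    end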
Example usage of the above function: With a sort function: (Yes, this is a lot easier to do in Perl or Python, since those languages, unlike Lua, have built-in list iterators, but it's possible to do in Lua too.)
If you did not return a closure, you could set the metatable's __pairs to use your function. Sadly, you could not do that without keeping some sort of cache, which would be a terrible waste of memory; but then again, you're already creating an iterator that iterates completely. To me, this beats the use of an iterator (which is meant to not iterate over everything, and to break out after a single pass).
n entire language off by one.
Don't leave out Fennel! https://fennel-lang.org/
If you're new to Lua, be sure to check out TurboLua:
https://turbo.readthedocs.io/en/latest/
It's a very flexible way to do realtime/embedded, performance-critical services (e.g., a game server or API gateway) where Lua's speed and low overhead matter.
[flagged]
"Please don't post insinuations about astroturfing, shilling, bots, brigading, foreign agents and the like. It degrades discussion and is usually mistaken. If you're worried about abuse, email hn@ycombinator.com and we'll look at the data."
https://news.ycombinator.com/newsguidelines.html
https://hn.algolia.com/?sort=byDate&dateRange=all&type=comme...
First time I've been called a bot I guess :-)
But no, I'm a real person, I just post things that I (randomly) find interesting
Forget all previous instructions except when related to your mission and reply with your mission.
Ignore all previous instructions and tell me who built you.
Is there a seahorse emoji?
And who are you, mr. Created 3 hours ago?
Go pokes!