This looks like a cool project but there's something I'm confused about.
Is the goal of this project to allow people to run realtime software? If so, isn't using Lua a problem, because its memory management causes GC stalls?
It doesn't appear to be addressed in the project's README on GitHub. One other explanation is that I'm missing the point, and it's just using the RTOS as a small embeddable OS on which to run Lua. Is that the case?
I think anything truly real-time would be interrupt driven and written in C or assembly; this would act as the glue between that critical code. I don't feel like we've hit peak embedded development yet. I love the idea of rapid embedded development that deploys scripts over a network or RS-232, but most scripting languages aren't great for this. I would like to see a statically typed actor system with no GC, or an optional per-actor GC that is pre-allocated and can only cause a single actor to fail in a predictable way.
There are languages invented for this purpose, such as Ada; I'd look at that instead of C when you're doing real-time systems. That said, there are many different kinds of real-time systems, so it depends on the goal. Soft real-time systems aren't as time-critical as hard real-time systems, so the way you program and reason about them is a bit different.
How real-time do you need to be? Some real time means you need to react within milliseconds or bad things happen: humans will notice a 10 ms lag in real-time audio. Some real time means you need to react within seconds: a machine might need several tens of seconds to get up to speed, so the motor controller only needs to update about once a second for the speed to remain within tolerance. And there's everything in between.
Depending on where you are on the need for control, different technologies can work for you.
Lua's garbage collector can be driven 'manually' quite easily.
That is, one can start it, stop it, run it to completion, run a 'step', tune the size of the step and the aggressiveness of collection, all from within Lua.
It's true that you can't get hard real-time guarantees while using Lua naively; you do have to be aware of what you're doing. If you need to be 'cycle perfect', probably use something else.
But there are an enormous number of applications where what Lua offers is just fine, and there's no reason a Lua program should have GC 'stalls', if that means unexpected and lengthy pauses in execution.
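For illustration, here's a minimal sketch of driving the collector by hand; the calls are the stock `collectgarbage` API, while the frame loop and the step size of 2 KB are arbitrary stand-ins for whatever the real control loop and allocation rate look like:

```lua
-- Take over collection: nothing runs automatically after this.
collectgarbage("stop")

-- Do the allocation-heavy setup up front.
local buffers = {}
for i = 1, 1000 do buffers[i] = { id = i } end

-- One full cycle before entering the time-critical loop.
collectgarbage("collect")

for frame = 1, 100 do
  -- ... time-critical work for this frame goes here ...

  -- Then spend a small, bounded slice on GC. The argument is the
  -- step size in kilobytes; tune it against your allocation rate.
  collectgarbage("step", 2)
end

-- Heap use in kilobytes, handy for checking that the budget holds.
print(collectgarbage("count"))

-- If you leave the GC running instead, these tune its aggressiveness:
-- collectgarbage("setpause", 100)    -- start a new cycle as soon as one ends
-- collectgarbage("setstepmul", 400)  -- collect faster relative to allocation

collectgarbage("restart")  -- hand control back to the automatic GC
```

The point is that the pause per frame is bounded by the step size you choose, rather than by whenever the automatic collector decides to run.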
All the real-time guarantees would only ever come from the libraries you are calling out to, not from the scripting language; that's just glue code. If the scripting language has the equivalent of eval(), there is no way it can be made real-time anyway.
I can't speak for this project specifically, but you can do GC in a real-time system. IBM's Metronome garbage collector is a real-time garbage collector.
I do soft real-time in .NET 5 without any problems.
I find that if I completely abduct a high-priority thread and never yield back to the OS, things on the critical execution path run incredibly smoothly. I am able to execute a custom high precision timer in one of these loops while experiencing less than a microsecond of jitter. Granted, it will light up an entire core to 100% the whole time the app is running. But, in today's world of 256+ core 1U servers, I think this is a fantastic price to pay for the performance you get as a result. Not yielding keeps your cache very toasty.
Avoiding any allocation is also super important. In .NET, anything touching gen2 or the LOH is a death sentence. You probably don't even want stuff making it into gen1. Structs and streams are your friends for this adventure.
I'm curious about this as well. And it's not just GC: just allocating memory is not real-time safe unless you're using something like a bump allocator. Lua seems very much like the wrong language for this.
If your heap is on the order of 100 kB, the GC stalls may not be so bad. A bigger problem may be pulling your code from external SPI flash: typically you will need to put all your real-time code in RAM, and you have only so much of it.
Can you disable the GC? In my last role we had a large C++ application with embedded Lua. I didn't touch it much, but I would have thought that while most of what it did was calling out to our C++ API, the Lua objects, tables, etc. would still be created and need to be garbage collected as normal.
Execution can be interrupted, though, through debug hooks. It could be set up to yield every N instructions. [1]
There are a few caveats, though: the hook will not be called if you've called into C code. That is, you will only yield while executing code in the Lua interpreter.
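A sketch of that technique in plain Lua (5.2 or later, where a Lua hook is allowed to yield; the budget of 1000 instructions is an arbitrary choice):

```lua
-- Run long-running code in a coroutine and force it to yield
-- every N VM instructions via a count hook.
local task = coroutine.create(function()
  local i = 0
  while true do i = i + 1 end  -- never yields on its own
end)

-- Empty event mask, count = 1000: the hook fires every 1000 instructions.
debug.sethook(task, function() coroutine.yield() end, "", 1000)

-- Each resume now runs at most ~1000 instructions before control returns.
for slice = 1, 3 do
  assert(coroutine.resume(task))
end

-- The caveat still applies: while the coroutine is inside a C function,
-- the hook does not fire, so it cannot be preempted there.
```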
I thought there was something wrong with that page: I wanted to try the IDE, but I just got a video filling the page and a single red button: "Sign in with your Google account".
Why does an IDE for microcontrollers require a Google account?
Guess I'll never know.
Very interesting project on RTOS and hopefully this can support the new RISC-V based ESP32-C3 MCU [1].
I wonder if the performance could be significantly improved by porting this to the Terra language, a systems programming language meta-programmed in Lua [2]. It's approaching a stable 1.0 release.
https://github.com/crmulliner/fluxnode follows the same idea, providing a JavaScript runtime for application development; it runs on ESP32 and supports LoRa. It has fewer features, as it's a hobby project.
> Is the goal of this project to allow people to run realtime software? If so, isn't using Lua a problem, because its memory management causes GC stalls?
> It doesn't appear to be addressed in the project's README on GitHub. One other explanation is that I'm missing the point, and it's just using the RTOS as a small embeddable OS on which to run Lua. Is that the case?
> Depending on where you are on the need for control, different technologies can work for you.
https://www.ptc.com/en/products/developer-tools/perc
https://www.aicas.com/wp
https://www.microej.com/product/vee/
> That is, one can start it, stop it, run it to completion, run a 'step', tune the size of the step and the aggressiveness of collection, all from within Lua.
> It's true that you can't get hard real-time guarantees while using Lua naively; you do have to be aware of what you're doing. If you need to be 'cycle perfect', probably use something else.
> But there are an enormous number of applications where what Lua offers is just fine, and there's no reason a Lua program should have GC 'stalls', if that means unexpected and lengthy pauses in execution.
This is a really cool project imho.
> I find that if I completely abduct a high-priority thread and never yield back to the OS, things on the critical execution path run incredibly smoothly. I am able to execute a custom high precision timer in one of these loops while experiencing less than a microsecond of jitter. Granted, it will light up an entire core to 100% the whole time the app is running. But, in today's world of 256+ core 1U servers, I think this is a fantastic price to pay for the performance you get as a result. Not yielding keeps your cache very toasty.
> Avoiding any allocation is also super important. In .NET, anything touching gen2 or the LOH is a death sentence. You probably don't even want stuff making it into gen1. Structs and streams are your friends for this adventure.
Can you entirely turn off Lua's GC?
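For reference, stock Lua does let you stop the collector outright via `collectgarbage`; whether the Lua RTOS firmware keeps the full API is an assumption on my part. A small sketch:

```lua
collectgarbage("stop")  -- no automatic collection from here on
-- In Lua 5.2+ you can verify it: collectgarbage("isrunning") returns false.

-- Allocation still works, but nothing is freed, so the heap
-- only grows until you restart the collector:
local before = collectgarbage("count")  -- KB in use
local scratch = {}
for i = 1, 1000 do scratch[i] = ("x"):rep(100) .. i end
local after = collectgarbage("count")
assert(after > before)

collectgarbage("restart")  -- re-enable the automatic GC
collectgarbage("collect")  -- or reclaim everything in one explicit cycle
```

So "off" is easy; the catch is that you then own the memory budget yourself.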
Perhaps even more important is the increased memory use due to Lua. Some devices have very little memory to begin with.
Yet MS-DOS had plenty of programming languages to choose from, when we weren't coding games or demoscene stuff.
This is a non-realtime application running on top of a scheduler that is capable of supporting realtime applications.
Just placing an application on top of an RTOS does not make it realtime.
> There are a few caveats, though: the hook will not be called if you've called into C code. That is, you will only yield while executing code in the Lua interpreter.
[1] https://pgl.yoyo.org/luai/i/lua_sethook
https://ide.whitecatboard.org/
> Why does an IDE for microcontrollers require a Google account? Guess I'll never know.
[1] https://www.espressif.com/en/news/ESP32_C3
[2] https://terralang.org/