
Example Lua module for coroutining #2851

Merged: 3 commits, Jul 26, 2019
88 changes: 88 additions & 0 deletions docs/lua-modules/cohelper.md
@@ -0,0 +1,88 @@
# cohelper Module
| Since | Origin / Contributor | Maintainer | Source |
| :----- | :-------------------- | :---------- | :------ |
| 2019-07-24 | [TerryE](https://github.com/TerryE) | [TerryE](https://github.com/TerryE) | [cohelper.lua](../../lua_modules/cohelper/cohelper.lua) |

This module provides a simple wrapper around long-running functions, allowing
them to execute within the SDK and its advised limit of 15 ms per individual
task execution. It does this by exploiting the standard Lua coroutine
functionality as described in the [Lua RM §2.11](https://www.lua.org/manual/5.1/manual.html#2.11) and [PiL Chapter 9](https://www.lua.org/pil/9.html).
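As a quick refresher, the coroutine primitives the module builds on behave like this in plain Lua 5.1 (an illustrative snippet, not part of the module):

```lua
-- Plain Lua 5.1: values flow both ways across resume()/yield()
local co = coroutine.create(function(a)
  local b = coroutine.yield(a + 1)  -- suspend; a+1 is returned by resume()
  return b * 2                      -- b is the value passed to the next resume()
end)
print(coroutine.resume(co, 10))  -- true, 11
print(coroutine.resume(co, 5))   -- true, 10
```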

The NodeMCU Lua VM fully supports the standard coroutine functionality. Any
interactive or callback tasks are executed in the default thread, and the coroutine
> **Member:** I think it is no good calling this a thread. As the PiL also points out, it is only similar: a thread is executed concurrently, so readers will get confused. Maybe use "execution context" or whatever.

> **Collaborator (author):**
> > A thread is executed concurrently.
>
> Gregor, I must disagree on this one. Until we had multi-processor CPUs, threads would never be executed concurrently, since there was only one processing unit. The WP thread article has a nice definition: "a thread of execution is the smallest sequence of programmed instructions that can be managed independently by a scheduler". Features such as concurrency and pre-emption are more emergent benefits of this independence. In our Lua world the machine is the Lua VM and this must, like node.js, run in a single OS thread.
>
> What I am trying to do is explain how this works to the average IoT developer. If you can suggest better wording then I'll adopt it.

> **Member:** The term to use, I think, is "cooperative multi-threading", assuming I am correct that, on our C substrate, a Lua task will run to completion, without preemption and without concurrent access to the Lua VM and its heap, even if tasks are being dispatched by multiple threads. I'm not sure that this module's documentation is the right place to spell it out; perhaps `node.task.post()` is the right place and this should simply cite that.

> **Collaborator (author):** I've used stack instead of thread.

itself runs in a second separate Lua stack. The coroutine can call any library
functions, but any subsequent callbacks will, of course, execute in the default
stack.

Interaction between the coroutine and the parent is through yield and resume
statements, and since the order of SDK tasks is indeterminate, the application
must take care to handle any ordering issues. This particular example uses
the `node.task.post()` API with the `taskYield()` function to resume itself,
so the running code can call `taskYield()` at regular points in the processing
to split the work into separate SDK tasks.

A similar approach could be based on a timer, or on a socket or pipe callback. If
you want to develop such a variant, start by reviewing the source and understanding
what it does.
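For instance, a timer-based variant might replace the module's `taskYieldFactory()` along these lines (a hypothetical, untested sketch; `interval_ms` is an illustrative parameter, not part of the module):

```lua
-- Hypothetical timer-based yield factory (sketch only, untested)
local function timerYieldFactory(co, interval_ms)
  return function(nCBs) -- upvals: co, interval_ms
    tmr.create():alarm(interval_ms or 10, tmr.ALARM_SINGLE, function()
      coroutine.resume(co, nCBs or 0) -- resume once the timer fires
    end)
    return coroutine.yield() + 1
  end
end
```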

### Require
```lua
local cohelper = require("cohelper")
-- or called directly via the `exec()` method
require("cohelper").exec(func, <params>)
```

### Release

Not required. All resources are released on completion of the `exec()` method.

## `cohelper.exec()`
Execute a function which is wrapped by a coroutine handler.

#### Syntax
`require("cohelper").exec(func, <params>)`

#### Parameters
- `func`: Lua function to be executed as a coroutine.
- `<params>`: list of zero or more parameters used to initialise `func`; their number and types must match the declaration of `func`.

#### Returns
The result of the first yield.

#### Notes
1. The coroutine function `func()` has 1+_n_ arguments. The first is the supplied task-yield function; calling this within `func()` temporarily breaks execution and causes an SDK reschedule, which might allow other pending tasks to execute before `func()` is resumed. The remaining arguments are passed to `func()` on its first call.
2. The current implementation passes a single integer parameter across the `resume()` / `yield()` interface. This acts to count the number of yields that occur. Depending on your application requirements, you might wish to amend this.
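For example, one possible amendment (a sketch, not part of the module) would carry a state table rather than a bare counter across the interface:

```lua
-- Hypothetical variation of the module's taskYieldFactory() that passes
-- an arbitrary state value (e.g. a table) instead of an integer count
local function taskYieldFactory(co)
  local post = node.task.post
  return function(state)            -- state: any Lua value, e.g. {lines = 0}
    post(function()                 -- resume in a later SDK task
      coroutine.resume(co, state)
    end)
    return coroutine.yield()        -- whatever the next resume() passes back
  end
end
```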

### Full Example

Here is a function which recursively walks the globals environment, the ROM table
and the Registry. Without coroutining, this walk terminates with a PANIC following
a watchdog timeout. I don't want to sprinkle the code with `tmr.wdclr()`, as that could
in turn cause the network stack to fail. Here is how to do it using coroutining:

```lua
require "cohelper".exec(
  function(taskYield, list)
    local s, n, nCBs = {}, 0, 0

    local function list_entry(name, v) -- upvals: taskYield, s, n, nCBs
      print(name, v)
      n = n + 1
      if n % 20 == 0 then nCBs = taskYield(nCBs) end
      if type(v):sub(-5) ~= 'table' or s[v] or name == 'Reg.stdout' then return end
      s[v] = true
      for k, tv in pairs(v) do
        list_entry(name..'.'..k, tv)
      end
      s[v] = nil
    end

    for k, v in pairs(list) do
      list_entry(k, v)
    end
    print('Total lines, print batches = ', n, nCBs)
  end,
  {_G = _G, Reg = debug.getregistry(), ROM = ROM}
)
```

3 changes: 3 additions & 0 deletions lua_modules/cohelper/README.md
@@ -0,0 +1,3 @@
# Coroutine Helper Module

Documentation for this Lua module is available in the [Lua Modules->cohelper](../../docs/lua-modules/cohelper.md) MD file and in the [Official NodeMCU Documentation](https://nodemcu.readthedocs.io/) in `Lua Modules` section.
27 changes: 27 additions & 0 deletions lua_modules/cohelper/cohelper.lua
@@ -0,0 +1,27 @@
--[[ A coroutine Helper    T. Ellison, June 2019

This version of the coroutine helper demonstrates the use of coroutines within
NodeMCU execution to split structured Lua code into smaller tasks

]]
--luacheck: read globals node

local modname = ...

local function taskYieldFactory(co)
local post = node.task.post
return function(nCBs) -- upval: co,post
post(function () -- upval: co, nCBs
coroutine.resume(co, nCBs or 0)
end)
return coroutine.yield() + 1
end
end

return { exec = function(func, ...) -- upval: modname
package.loaded[modname] = nil
local co = coroutine.create(func)
return coroutine.resume(co, taskYieldFactory(co), ... )
end }