(No) safe way to wrap taichi kernels in pytorch #11493

Workflow file for this run

name: Slash Command Dispatch
on:
  issue_comment:
    types: [created]
jobs:
  check_comments:
    runs-on: ubuntu-latest
    steps:
      - name: Slash Command Dispatch - Triage
        uses: peter-evans/slash-command-dispatch@v3
        with:
          token: ${{ secrets.GARDENER_PAT }}
          reaction-token: ${{ secrets.GARDENER_PAT }}
          repository: taichi-dev/ci_workflows
          issue-type: pull-request
          permission: triage
          commands: |
            rebase
            rerun
            rerun-failed
      - name: Slash Command Dispatch - Repo - Triage
        uses: peter-evans/slash-command-dispatch@v3
        with:
          token: ${{ secrets.GARDENER_PAT }}
          reaction-token: ${{ secrets.GARDENER_PAT }}
          issue-type: pull-request
          permission: triage
          commands: |
            benchmark
      - name: Slash Command Dispatch - Repo - Write
        uses: peter-evans/slash-command-dispatch@v3
        with:
          token: ${{ secrets.GARDENER_PAT }}
          reaction-token: ${{ secrets.GARDENER_PAT }}
          issue-type: pull-request
          permission: write
          commands: |
            land
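
For reference, peter-evans/slash-command-dispatch converts a matching /command comment into a repository_dispatch event in the target repository, named "<command>-command" by default. Under the configuration above, /rebase, /rerun, and /rerun-failed (triage permission) dispatch to taichi-dev/ci_workflows, while /benchmark (triage) and /land (write) dispatch back to the current repository, since those two steps set no repository. A minimal sketch of a receiving workflow for the /rerun command follows; the job body and the exact client_payload paths are illustrative assumptions, not taken from taichi-dev/ci_workflows:

name: Rerun Command
on:
  repository_dispatch:
    types: [rerun-command]  # slash-command-dispatch's default event name for /rerun
jobs:
  rerun:
    runs-on: ubuntu-latest
    steps:
      - name: Show which pull request requested the rerun
        run: |
          # client_payload carries the originating comment's context
          echo "Command: ${{ github.event.client_payload.slash_command.command }}"
          echo "PR number: ${{ github.event.client_payload.github.payload.issue.number }}"

A handler like this runs in the dispatch target, so it can use its own secrets and runners while still knowing which pull request and command triggered it.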