
chore(deps): update dependency semgrep to ~=1.109.0 #746

Open: wants to merge 1 commit into base branch main
Conversation

renovate[bot]
Contributor

@renovate renovate bot commented Feb 26, 2025

This PR contains the following updates:

Package: semgrep
Change: ~=1.107.0 -> ~=1.109.0

Release Notes

returntocorp/semgrep (semgrep)

v1.109.0

Compare Source

Changed
  • pyproject.toml files are now parsed using a TOML parser (tomli). (sc-2054)
Fixed
  • pro: taint-mode: Fixed a limitation in custom taint propagators.
    See https://semgrep.dev/playground/s/ReJQO (code-7967)
  • taint-mode: Disable symbolic-propagation when matching taint propagators
    to prevent unintended interactions. See https://semgrep.dev/playground/s/7KE0k. (code-8054)
  • Fixed pattern match deduplication to avoid an O(n^2) worst-case complexity, and
    optimized the matching of ordered ..., PAT, ... patterns. (saf-682)
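The deduplication fix above replaces a quadratic compare-every-pair approach. A hedged sketch of the general technique (the match records and key fields here are hypothetical, not semgrep's actual types):

```python
# Hypothetical match records; semgrep's real match type carries much more.
matches = [
    {"rule": "r1", "path": "a.py", "start": 3, "end": 7},
    {"rule": "r1", "path": "a.py", "start": 3, "end": 7},  # duplicate
    {"rule": "r2", "path": "a.py", "start": 10, "end": 12},
]

def dedup(ms):
    """Deduplicate in O(n) expected time with a seen-set of hashable keys,
    instead of comparing every match against every other one (O(n^2))."""
    seen, out = set(), []
    for m in ms:
        key = (m["rule"], m["path"], m["start"], m["end"])
        if key not in seen:
            seen.add(key)
            out.append(m)
    return out

print(len(dedup(matches)))  # 2
```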

v1.108.0

Compare Source

Added
  • pro: Semgrep can now dynamically resolve dependencies for Python projects using pip, allowing it to determine transitive dependencies automatically. (sc-2069)
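The release note says dependencies are resolved "using pip" but not how. One plausible mechanism is pip's resolver report (`pip install --dry-run --report`, pip >= 22.2); everything below is an assumption sketched around that flag, not semgrep's actual implementation:

```python
import json
import subprocess
import sys

def resolution_command(requirement: str) -> list[str]:
    """Hypothetical: ask pip to resolve (but not install) a requirement and
    emit its JSON installation report on stdout ("-")."""
    return [sys.executable, "-m", "pip", "install",
            "--dry-run", "--quiet", "--report", "-", requirement]

def names_from_report(report: dict) -> list[str]:
    """Extract resolved package names (direct + transitive) from the report."""
    return [item["metadata"]["name"] for item in report.get("install", [])]

# Canned report in pip's documented shape, so the parsing is testable offline:
sample = {"install": [{"metadata": {"name": "semgrep", "version": "1.108.0"}},
                      {"metadata": {"name": "click", "version": "8.1.7"}}]}
print(names_from_report(sample))  # ['semgrep', 'click']
```

In a live run, `json.loads(subprocess.run(resolution_command("pkg"), check=True, capture_output=True, text=True).stdout)` would produce a report in this shape.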
Changed
  • Bump base Alpine docker image from 3.19 to 3.21. (alpine-version)
  • The semgrep-appsec-platform-specific metadata fields "semgrep.dev:" and
    "semgrep.policy:" are now filtered from the JSON output unless you
    are logged in to the Semgrep appsec platform.
    See https://semgrep.dev/docs/semgrep-appsec-platform/json-and-sarif#json for more information. (metadata-filter)
  • The Semgrep Docker image now uses Python 3.12 (bumped from 3.11). (python-version)
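The metadata filtering described above can be sketched as a key filter over rule metadata. Only the two key names come from the release note; the function name, shape, and sample data are illustrative:

```python
def filter_metadata(metadata: dict, logged_in: bool) -> dict:
    """Drop platform-only metadata keys from JSON output for anonymous runs;
    logged-in users see everything."""
    if logged_in:
        return metadata
    hidden = {"semgrep.dev", "semgrep.policy"}
    return {k: v for k, v in metadata.items() if k not in hidden}

# Illustrative rule metadata:
meta = {
    "semgrep.dev": {"rule": {"origin": "pro_rules"}},
    "semgrep.policy": {"slug": "default-policy"},
    "references": ["https://owasp.org/Top10/"],
}
print(sorted(filter_metadata(meta, logged_in=False)))  # ['references']
print(sorted(filter_metadata(meta, logged_in=True)))
```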
Fixed
  • Failures in `git worktree remove` are now handled more gracefully:
    instead of erroring, Semgrep continues the scan so the user still gets
    results, and logs the error. A new guard also makes this failure less
    likely and includes more debugging information when it does occur. (sms-521)
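The log-and-continue behavior described in that fix can be sketched as follows; the function name, return convention, and logger are illustrative, not semgrep's actual code (only the `git worktree remove` command is from the note):

```python
import logging
import subprocess

logger = logging.getLogger("semgrep.scan")

def remove_worktree(path: str) -> bool:
    """Try to clean up a git worktree; on failure, log diagnostics and
    return False instead of raising, so the scan can keep going."""
    try:
        subprocess.run(
            ["git", "worktree", "remove", "--force", path],
            check=True, capture_output=True, text=True,
        )
        return True
    except (subprocess.CalledProcessError, OSError) as e:
        # Record enough context to debug later, but do not abort the scan.
        logger.error("git worktree remove failed for %r: %s", path, e)
        return False
```

The caller treats a False return as "cleanup failed, results still valid" rather than a fatal error.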

Configuration

📅 Schedule: Branch creation - "* 0-12 * * 3" (UTC), Automerge - At any time (no schedule defined).

🚦 Automerge: Disabled by config. Please merge this manually once you are satisfied.

Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

🔕 Ignore: Close this PR and you won't be reminded about this update again.


  • If you want to rebase/retry this PR, check this box

This PR was generated by Mend Renovate. View the repository job log.

@renovate renovate bot requested a review from thypon as a code owner February 26, 2025 01:29

[puLL-Merge] - returntocorp/[email protected]

Diff
diff --git .github/workflows/build-test-osx-arm64.jsonnet .github/workflows/build-test-osx-arm64.jsonnet
index 96ce46d68f58..638d9ad0c629 100644
--- .github/workflows/build-test-osx-arm64.jsonnet
+++ .github/workflows/build-test-osx-arm64.jsonnet
@@ -21,7 +21,7 @@ local runs_on = 'macos-latest';
 local setup_python_step =  {
   uses: 'actions/setup-python@v4',
   with: {
-    'python-version': '3.11',
+    'python-version': semgrep.python_version,
   }
 };
 
diff --git .github/workflows/build-test-osx-arm64.yml .github/workflows/build-test-osx-arm64.yml
index f2f438de50a9..b103994b4f32 100644
--- .github/workflows/build-test-osx-arm64.yml
+++ .github/workflows/build-test-osx-arm64.yml
@@ -5,7 +5,7 @@ jobs:
     steps:
       - uses: actions/setup-python@v4
         with:
-          python-version: "3.11"
+          python-version: "3.12"
       - uses: actions/checkout@v3
         with:
           submodules: true
@@ -44,7 +44,7 @@ jobs:
     steps:
       - uses: actions/setup-python@v4
         with:
-          python-version: "3.11"
+          python-version: "3.12"
       - uses: actions/checkout@v3
         with:
           submodules: true
@@ -66,7 +66,7 @@ jobs:
     steps:
       - uses: actions/setup-python@v4
         with:
-          python-version: "3.11"
+          python-version: "3.12"
       - uses: actions/download-artifact@v4
         with:
           name: osx-arm64-wheel
diff --git .github/workflows/build-test-windows-x86.jsonnet .github/workflows/build-test-windows-x86.jsonnet
index e3a5164694f8..decc7ccb5765 100644
--- .github/workflows/build-test-windows-x86.jsonnet
+++ .github/workflows/build-test-windows-x86.jsonnet
@@ -15,6 +15,9 @@ local defaults = {
     shell: 'bash',
   },
 };
+// TODO: We can remove this and switch to semgrep.opam_switch once we move to
+// OCaml 5 everywhere.
+local opam_switch = '5.2.1';
 
 // ----------------------------------------------------------------------------
 // The job
@@ -24,28 +27,38 @@ local build_core_job = {
   // github action, which we need for the latest cohttp and for OCaml 5. Currently,
   // `ocamlfind` fails to build when we run this workflow in CI. The ticket for
   // re-enabling the job is https://linear.app/semgrep/issue/SAF-1728/restore-windows-workflow
-  'if': 'false',
   'runs-on': runs_on,
   defaults: defaults,
   steps: [
     actions.checkout_with_submodules(),
     {
-      uses: 'ocaml/setup-ocaml@v2',
+      uses: 'ocaml/setup-ocaml@v3',
       with: {
-        'ocaml-compiler': '4.14',
-        // we switch from fdopen's opam mingw repo to the official one
-        // otherwise we can't install recent packages like ocamlformat 0.26.2
-        // the opam-repository-mingw has the "sunset" branch because it should
-        // soon be unecessary once opam 2.2 is released.
-        'opam-repositories': |||
-           opam-repository-mingw: https://github.com/ocaml-opam/opam-repository-mingw.git#sunset
-           default: https://github.com/ocaml/opam-repository.git
-        |||,
+        'ocaml-compiler': opam_switch,
         // bogus filename to prevent the action from attempting to install
         // anything (we want deps only)
         'opam-local-packages': 'dont_install_local_packages.opam',
       },
     },
+    {
+      // TODO: Remove this once the stable version of `mingw64-x86_64-openssl`
+      // is updated in Cygwin.
+      //
+      // setup-ocaml@v3 uses a newer version of `mingw64-x86_64-openssl` which
+      // isn't marked as "stable"; see:
+      // https://github.com/ocaml/setup-ocaml/issues/856#issuecomment-2439978460
+      //
+      // But, we need an older version of `mingw64-x86_64-openssl` for our
+      // build since some of our depexts, for instance, `mingw64-x86_64-curl`
+      // would be compiled against the stable (older) version of
+      // `mingw64-x86_64-openssl`. So, we install an older version here.
+      name: 'Install older openssl in Cygwin',
+      run: |||
+        PACKAGES='mingw64-x86_64-openssl=1.0.2u+za-1,mingw64-i686-openssl=1.0.2u+za-1'
+        CYGWIN_ROOT=$(cygpath -w /)
+        $CYGWIN_ROOT/setup-x86_64.exe -P $PACKAGES --quiet-mode -R $CYGWIN_ROOT
+      |||
+    },
     // Why this cache when ocaml/setup-ocaml is already caching things?
     // - setup-ocaml caches the cygwin and downloaded opam packages, but not the
     //   installed opam packages
@@ -54,11 +67,34 @@ local build_core_job = {
     // Note: we must cache after setup-ocaml, not before, because
     // setup-ocaml would reset the cached _opam
     semgrep.cache_opam.step(
-      key=semgrep.opam_switch + "-${{ hashFiles('semgrep.opam') }}",
+      key=opam_switch + "-${{ hashFiles('semgrep-pro.opam', 'OSS/semgrep.opam') }}",
       // ocaml/setup-ocaml creates the opam switch local to the repository
       // (vs. ~/.opam in our other workflows)
       path='_opam',
     ),
+    {
+      // TODO: We can remove this once these flexdll PRs are merged and a new
+      // version of flexdll is released:
+      // - https://github.com/ocaml/flexdll/pull/151
+      // - https://github.com/ocaml/flexdll/pull/152
+
+      // Currently, flexlink only uses response files with MSVC and LIGHTLD
+      // compilers. With the MINGW64 compiler, we get an "argument list too
+      // long error". We use a patched version of flexlink that uses response
+      // files with MINGW64.
+
+      // flexlink also calls cygpath to normalize a bunch of paths, and our
+      // build has too many search paths which causes an "argument list too
+      // long" error. We use a patched flexlink which passes these arguments
+      // in a file to cygpath.
+      name: 'Install flexlink patched to use response files and cygpath -file arg',
+      run: |||
+          git clone -b argument-list-too-long https://github.com/punchagan/flexdll.git
+          cd flexdll/
+          opam exec -- make all MSVC_DETECT=0 CHAINS="mingw64"
+          cp flexlink.exe ../_opam/bin/
+      |||
+    },
     { name: 'Debug stuff',
       run: |||
         ls
@@ -69,8 +105,6 @@ local build_core_job = {
         # CC=x86_64-w64-mingw32-gcc but there is no AR=x86_64-w64-mingw32-ar
         which ar
         ar --version
-        # GHA installs cygwin in a special place
-        export PATH="${CYGWIN_ROOT_BIN}:${PATH}"
         which ar
         ar --version
         which opam
@@ -107,15 +141,23 @@ local build_core_job = {
     {
       name: 'Install OPAM deps',
       run: |||
-        export PATH="${CYGWIN_ROOT_BIN}:${PATH}"
         make install-deps-WINDOWS-for-semgrep-core
+        # NOTE: ocurl's ./configure fails with an error finding curl/curl.h.
+        # Setting PKG_CONFIG_PATH to $(x86_64-w64-mingw32-gcc
+        # -print-sysroot)/mingw/include would set UNIX paths for CFLAG and
+        # LDFLAG, but that doesn't work. Setting Windows PATHs for them gets
+        # the ocurl build to work. To avoid setting these PATHs for all the
+        # package builds, we first try to install all the dependencies, and
+        # then install ocurl and later other dependencies that depend on ocurl.
+        make install-opam-deps || true
+        export CYGWIN_SYS_ROOT="$(x86_64-w64-mingw32-gcc --print-sysroot)"
+        CFLAGS="-I$(cygpath -w $CYGWIN_SYS_ROOT/mingw/include)" LDFLAGS="-L$(cygpath -w $CYGWIN_SYS_ROOT/mingw/lib)" opam install -y ocurl.0.9.1
         make install-opam-deps
       |||,
     },
     {
       name: 'Build semgrep-core',
       run: |||
-        export PATH=\"${CYGWIN_ROOT_BIN}:${PATH}\"
         export TREESITTER_INCDIR=$(pwd)/libs/ocaml-tree-sitter-core/tree-sitter/include
         export TREESITTER_LIBDIR=$(pwd)/libs/ocaml-tree-sitter-core/tree-sitter/lib
         # We have to strip rpath from the tree-sitter projects because there's no
diff --git .github/workflows/build-test-windows-x86.yml .github/workflows/build-test-windows-x86.yml
index 9499652b29d4..868df392d993 100644
--- .github/workflows/build-test-windows-x86.yml
+++ .github/workflows/build-test-windows-x86.yml
@@ -4,26 +4,33 @@ jobs:
     defaults:
       run:
         shell: bash
-    if: "false"
     runs-on: windows-latest
     steps:
       - uses: actions/checkout@v3
         with:
           submodules: true
-      - uses: ocaml/setup-ocaml@v2
+      - uses: ocaml/setup-ocaml@v3
         with:
-          ocaml-compiler: "4.14"
+          ocaml-compiler: 5.2.1
           opam-local-packages: dont_install_local_packages.opam
-          opam-repositories: |
-            opam-repository-mingw: https://github.com/ocaml-opam/opam-repository-mingw.git#sunset
-            default: https://github.com/ocaml/opam-repository.git
+      - name: Install older openssl in Cygwin
+        run: |
+          PACKAGES='mingw64-x86_64-openssl=1.0.2u+za-1,mingw64-i686-openssl=1.0.2u+za-1'
+          CYGWIN_ROOT=$(cygpath -w /)
+          $CYGWIN_ROOT/setup-x86_64.exe -P $PACKAGES --quiet-mode -R $CYGWIN_ROOT
       - env:
           SEGMENT_DOWNLOAD_TIMEOUT_MINS: 2
         name: Set GHA cache for OPAM in _opam
         uses: actions/cache@v3
         with:
-          key: ${{ runner.os }}-${{ runner.arch }}-v1-opam-4.14.0-${{ hashFiles('semgrep.opam') }}
+          key: ${{ runner.os }}-${{ runner.arch }}-v1-opam-5.2.1-${{ hashFiles('semgrep-pro.opam', 'OSS/semgrep.opam') }}
           path: _opam
+      - name: Install flexlink patched to use response files and cygpath -file arg
+        run: |
+          git clone -b argument-list-too-long https://github.com/punchagan/flexdll.git
+          cd flexdll/
+          opam exec -- make all MSVC_DETECT=0 CHAINS="mingw64"
+          cp flexlink.exe ../_opam/bin/
       - name: Debug stuff
         run: |
           ls
@@ -34,8 +41,6 @@ jobs:
           # CC=x86_64-w64-mingw32-gcc but there is no AR=x86_64-w64-mingw32-ar
           which ar
           ar --version
-          # GHA installs cygwin in a special place
-          export PATH="${CYGWIN_ROOT_BIN}:${PATH}"
           which ar
           ar --version
           which opam
@@ -57,12 +62,20 @@ jobs:
           make PREFIX="$prefix" install
       - name: Install OPAM deps
         run: |
-          export PATH="${CYGWIN_ROOT_BIN}:${PATH}"
           make install-deps-WINDOWS-for-semgrep-core
+          # NOTE: ocurl's ./configure fails with an error finding curl/curl.h.
+          # Setting PKG_CONFIG_PATH to $(x86_64-w64-mingw32-gcc
+          # -print-sysroot)/mingw/include would set UNIX paths for CFLAG and
+          # LDFLAG, but that doesn't work. Setting Windows PATHs for them gets
+          # the ocurl build to work. To avoid setting these PATHs for all the
+          # package builds, we first try to install all the dependencies, and
+          # then install ocurl and later other dependencies that depend on ocurl.
+          make install-opam-deps || true
+          export CYGWIN_SYS_ROOT="$(x86_64-w64-mingw32-gcc --print-sysroot)"
+          CFLAGS="-I$(cygpath -w $CYGWIN_SYS_ROOT/mingw/include)" LDFLAGS="-L$(cygpath -w $CYGWIN_SYS_ROOT/mingw/lib)" opam install -y ocurl.0.9.1
           make install-opam-deps
       - name: Build semgrep-core
         run: |
-          export PATH=\"${CYGWIN_ROOT_BIN}:${PATH}\"
           export TREESITTER_INCDIR=$(pwd)/libs/ocaml-tree-sitter-core/tree-sitter/include
           export TREESITTER_LIBDIR=$(pwd)/libs/ocaml-tree-sitter-core/tree-sitter/lib
           # We have to strip rpath from the tree-sitter projects because there's no
diff --git .github/workflows/libs/gha.libsonnet .github/workflows/libs/gha.libsonnet
index 4e41ca93cf53..6299a64430ff 100644
--- .github/workflows/libs/gha.libsonnet
+++ .github/workflows/libs/gha.libsonnet
@@ -9,7 +9,11 @@
     // can be run manually from the GHA dashboard
     workflow_dispatch: null,
     // on the PR
-    pull_request: null,
+    pull_request: {
+      types: ['opened', 'reopened', 'synchronize'],
+      // https://graphite.dev/docs/merge-pull-requests#ignoring-graphites-temporary-branches-in-your-ci
+      'branches-ignore': ['**/graphite-base/**'],
+    },
     // and another time once the PR is merged on develop
     push: {
       branches: [
diff --git .github/workflows/libs/semgrep.libsonnet .github/workflows/libs/semgrep.libsonnet
index ee024fca4601..24732d2a90bd 100644
--- .github/workflows/libs/semgrep.libsonnet
+++ .github/workflows/libs/semgrep.libsonnet
@@ -378,6 +378,7 @@ local setup_nix_step = [
   // build-test-manylinux-x86.jsonnet in pro, tests.jsonnet in OSS
   // TODO? could switch to higher like 3.11
   default_python_version: '3.9',
+  python_version: '3.12',
   containers: containers,
 
   github_bot: github_bot,
diff --git .github/workflows/lint.yml .github/workflows/lint.yml
index f861ef9a6b36..af4a852168f0 100644
--- .github/workflows/lint.yml
+++ .github/workflows/lint.yml
@@ -43,7 +43,13 @@ jobs:
       - uses: pre-commit/[email protected]
 name: lint
 on:
-  pull_request: null
+  pull_request:
+    branches-ignore:
+      - '**/graphite-base/**'
+    types:
+      - opened
+      - reopened
+      - synchronize
   push:
     branches:
       - develop
diff --git .github/workflows/release-homebrew.jsonnet .github/workflows/release-homebrew.jsonnet
index b4ffda07e7c0..e4b4fbe6e03c 100644
--- .github/workflows/release-homebrew.jsonnet
+++ .github/workflows/release-homebrew.jsonnet
@@ -66,14 +66,6 @@ local homebrew_core_pr_job(version) = {
     {
       run: 'brew update',
     },
-    // ugly:  'brew bump-formula-pr' below internally calls
-    // /path/to/python -m pip install -q ... semgrep==1.xxx.yyy
-    // to fetch Semgrep python dependencies from Pypi but this path to python
-    // seems currently broken hence the ugly fix below
-    {
-      name: 'ugly: fix the python path for brew bump-formula-pr',
-      run: 'cd /usr/local/Cellar/[email protected]; ln -s 3.11.6_1 3.11.7'
-    },
     {
       name: 'Dry Run bump semgrep.rb',
       // This step does some brew oddities (setting a fake version, and
diff --git .github/workflows/release-homebrew.yml .github/workflows/release-homebrew.yml
index 0436facc8271..2dcbe8c6c7a8 100644
--- .github/workflows/release-homebrew.yml
+++ .github/workflows/release-homebrew.yml
@@ -19,8 +19,6 @@ jobs:
     runs-on: macos-latest
     steps:
       - run: brew update
-      - name: 'ugly: fix the python path for brew bump-formula-pr'
-        run: cd /usr/local/Cellar/[email protected]; ln -s 3.11.6_1 3.11.7
       - env:
           HOMEBREW_GITHUB_API_TOKEN: ${{ secrets.SEMGREP_HOMEBREW_RELEASE_PAT }}
         if: ${{ inputs.dry-run }}
diff --git .github/workflows/tests.jsonnet .github/workflows/tests.jsonnet
index 44909804b98f..794ef95da706 100644
--- .github/workflows/tests.jsonnet
+++ .github/workflows/tests.jsonnet
@@ -198,6 +198,7 @@ local test_cli_job = {
         '3.9',
         '3.10',
         '3.11',
+        '3.12'
       ],
     },
   },
@@ -257,7 +258,7 @@ local test_qa_job = {
       name: 'Fetch semgrep-cli submodules',
       run: 'git submodule update --init --recursive --recommend-shallow cli/src/semgrep/semgrep_interfaces tests/semgrep-rules',
     },
-    actions.setup_python_step('3.11'),
+    actions.setup_python_step(semgrep.python_version),
     actions.pipenv_install_step,
     download_x86_artifacts,
     install_x86_artifacts,
diff --git .github/workflows/tests.yml .github/workflows/tests.yml
index 07a21b2948c9..6fb2b6abaa86 100644
--- .github/workflows/tests.yml
+++ .github/workflows/tests.yml
@@ -195,6 +195,7 @@ jobs:
           - "3.9"
           - "3.10"
           - "3.11"
+          - "3.12"
   test-osemgrep:
     container: returntocorp/ocaml:alpine-2024-01-18
     env:
@@ -240,7 +241,7 @@ jobs:
       - uses: actions/setup-python@v4
         with:
           cache: pipenv
-          python-version: "3.11"
+          python-version: "3.12"
       - run: pip install pipenv==2024.0.1
       - uses: actions/download-artifact@v4
         with:
diff --git CHANGELOG.md CHANGELOG.md
index 33888951c9be..9f5e239c4747 100644
--- CHANGELOG.md
+++ CHANGELOG.md
@@ -6,6 +6,35 @@
 
 <!-- insertion point -->
 
+## [1.108.0](https://github.com/semgrep/semgrep/releases/tag/v1.108.0) - 2025-02-12
+
+
+### Added
+
+
+- pro: Semgrep can now dynamically resolve dependencies for Python projects using pip, allowing it to determine transitive dependencies automatically. (sc-2069)
+
+
+### Changed
+
+
+- Bump base Alpine docker image from 3.19 to 3.21. (alpine-version)
+- The semgrep-appsec-platform specific metadata fields "semgrep.dev:" and
+  "semgrep.policy:" are now filtered from the JSON output unless you
+  are logged in with the Semgrep appsec platform.
+  See https://semgrep.dev/docs/semgrep-appsec-platform/json-and-sarif#json for more information. (metadata-filter)
+- The Semgrep Docker image now uses Python 3.12 (bumped from 3.11). (python-version)
+
+
+### Fixed
+
+
+- This PR changes the way we handle failures in `git worktree remove` more gracefully.
+  Instead of erroring, we continue to scan so that the user can still get results, but
+  log the error. It also adds a guard so that this failure is less likely to happen
+  and will include more debugging information when it does. (sms-521)
+
+
 ## [1.107.0](https://github.com/semgrep/semgrep/releases/tag/v1.107.0) - 2025-02-04
 
 
diff --git Dockerfile Dockerfile
index 5f31ab8a7bb1..4497571dbee9 100644
--- Dockerfile
+++ Dockerfile
@@ -147,7 +147,7 @@ RUN make install-deps-for-semgrep-core &&\
 # TODO: Update beyond Alpine 3.19 to pick up Python versions newer than 3.11
 
 #coupling: the 'semgrep-oss' name is used in 'make build-docker'
-FROM alpine:3.19 AS semgrep-oss
+FROM alpine:3.21 AS semgrep-oss
 
 WORKDIR /pysemgrep
 
@@ -227,7 +227,10 @@ COPY --from=semgrep-core-container /src/semgrep/_build/default/src/main/Main.exe
 # installed them under /usr/local/lib/python3.xx/site-packages/semgrep/
 RUN ln -s semgrep-core /usr/local/bin/osemgrep && rm -rf /pysemgrep
 
-
+###############################################################################
+# Step2 bis: setup the docker image
+###############################################################################
+# In theory we could do this in a different container
 
 # Let the user know how their container was built
 COPY Dockerfile /Dockerfile
diff --git Makefile Makefile
index 26cd8fe518eb..adcc8810398a 100644
--- Makefile
+++ Makefile
@@ -488,7 +488,12 @@ nix-check-verbose:
 
 # used in build-test-windows-x86.jsonnet
 install-deps-WINDOWS-for-semgrep-core:
-	opam depext $(WINDOWS_OPAM_DEPEXT_DEPS)
+	opam install --depext-only $(WINDOWS_OPAM_DEPEXT_DEPS)
+	# Installing conf-pkg-config *reinstalls* mingw-w64-shims; the PATH changes
+	# done in the shims need to be available when installing other packages
+	# (conf-libcurl, for instance). So, we install conf-pkg-config before other
+	# packages are installed.
+	opam install conf-pkg-config
 
 ###############################################################################
 # Developer targets
diff --git TCB/Cap.ml TCB/Cap.ml
index e53ef2c9aebd..b16a869934bd 100644
--- TCB/Cap.ml
+++ TCB/Cap.ml
@@ -73,8 +73,12 @@ end
 (* FS *)
 (**************************************************************************)
 
-(* TODO: read vs write, specific dir (in_chan or out_chan of opened dir *)
 module FS = struct
+  type readdir = cap
+  type open_r = cap
+  type open_w = cap
+
+  (* TODO: read vs write, specific dir (in_chan or out_chan of opened dir *)
   type root_r = cap
   type root_w = cap
   type root_all_r = cap
@@ -193,6 +197,7 @@ end
  *)
 
 (* fs *)
+type readdir = < readdir : FS.readdir >
 type root = < root_r : FS.root_r ; root_w : FS.root_w >
 type root_all = < root_all_r : FS.root_all_r ; root_all_w : FS.root_all_w >
 type cwd = < cwd_r : FS.cwd_r ; cwd_w : FS.cwd_w >
@@ -203,7 +208,17 @@ type tmp = < tmp : FS.tmp >
 type files_argv =
   < files_argv_r : FS.files_argv_r ; files_argv_w : FS.files_argv_w >
 
-type fs = < root ; root_all ; cwd ; home ; dotfiles ; tmp ; files_argv >
+type fs =
+  < readdir
+  ; open_r : FS.open_r
+  ; open_w : FS.open_w
+  ; root
+  ; root_all
+  ; cwd
+  ; home
+  ; dotfiles
+  ; tmp
+  ; files_argv >
 
 (* console *)
 type stdin = < stdin : Console.stdin >
@@ -254,6 +269,9 @@ let no_caps : no_caps = object end
 let powerbox : all_caps =
   object
     (* fs *)
+    method readdir = ()
+    method open_r = ()
+    method open_w = ()
     method root_r = ()
     method root_w = ()
     method root_all_r = ()
@@ -333,6 +351,11 @@ let exec_and_tmp_caps_UNSAFE () =
     method tmp = ()
   end
 
+let readdir_UNSAFE () =
+  object
+    method readdir = ()
+  end
+
 (**************************************************************************)
 (* Entry point *)
 (**************************************************************************)
diff --git TCB/Cap.mli TCB/Cap.mli
index ad6d4982617a..61afbfda107c 100644
--- TCB/Cap.mli
+++ TCB/Cap.mli
@@ -44,6 +44,13 @@ end
 
 (* read/write on root|cwd|tmp|~|~.xxx| (and files/dirs mentioned in argv) *)
 module FS : sig
+  type readdir
+
+  (* a.k.a open_in and open_out in OCaml world *)
+  type open_r
+  type open_w
+
+  (* TODO: finer-grained readdir and open_r, open_w *)
   type root_r
   type root_w
 
@@ -101,6 +108,7 @@ end
 (**************************************************************************)
 
 (* fs *)
+type readdir = < readdir : FS.readdir >
 type root = < root_r : FS.root_r ; root_w : FS.root_w >
 type root_all = < root_all_r : FS.root_all_r ; root_all_w : FS.root_all_w >
 type cwd = < cwd_r : FS.cwd_r ; cwd_w : FS.cwd_w >
@@ -111,7 +119,17 @@ type tmp = < tmp : FS.tmp >
 type files_argv =
   < files_argv_r : FS.files_argv_r ; files_argv_w : FS.files_argv_w >
 
-type fs = < root ; root_all ; cwd ; home ; dotfiles ; tmp ; files_argv >
+type fs =
+  < readdir
+  ; open_r : FS.open_r
+  ; open_w : FS.open_w
+  ; root
+  ; root_all
+  ; cwd
+  ; home
+  ; dotfiles
+  ; tmp
+  ; files_argv >
 
 (* console *)
 type stdin = < stdin : Console.stdin >
@@ -179,6 +197,7 @@ val tmp_caps_UNSAFE : unit -> < tmp >
 val stdout_caps_UNSAFE : unit -> < stdout >
 val fork_and_limits_caps_UNSAFE : unit -> < fork ; time_limit ; memory_limit >
 val exec_and_tmp_caps_UNSAFE : unit -> < exec ; tmp >
+val readdir_UNSAFE : unit -> < readdir >
 
 (**************************************************************************)
 (* Entry point *)
diff --git TCB/TCB.ml TCB/TCB.ml
index 772031f96bef..439588034ce2 100644
--- TCB/TCB.ml
+++ TCB/TCB.ml
@@ -939,8 +939,13 @@ module Filename = struct
   let is_relative = Filename.is_relative
   let quote = Filename.quote
 
+  (* "." *)
+  let current_dir_name = Filename.current_dir_name
+
+  (* ".." *)
+  let parent_dir_name = Filename.parent_dir_name
+
   (* FORBIDDEN:
-     - current_dir_name, parent_dir_name
      - temp files stuff
      - ...
   *)
diff --git TCB/forbid_everything.jsonnet TCB/forbid_everything.jsonnet
index 34955e678341..d3fd5308cc3c 100644
--- TCB/forbid_everything.jsonnet
+++ TCB/forbid_everything.jsonnet
@@ -16,6 +16,7 @@ local forbid_chdir = import 'forbid_chdir.jsonnet';
 local forbid_tmp = import "forbid_tmp.jsonnet";
 local forbid_console = import 'forbid_console.jsonnet';
 local forbid_process = import 'forbid_process.jsonnet';
+local forbid_fs = import 'forbid_fs.jsonnet';
 local forbid_misc = import 'forbid_misc.jsonnet';
 
 { rules:
@@ -26,5 +27,6 @@ local forbid_misc = import 'forbid_misc.jsonnet';
     forbid_tmp.rules +
     forbid_console.rules +
     forbid_process.rules +
+    forbid_fs.rules +
     forbid_misc.rules
 }
diff --git a/TCB/forbid_fs.jsonnet b/TCB/forbid_fs.jsonnet
new file mode 100644
index 000000000000..eb9601514421
--- /dev/null
+++ TCB/forbid_fs.jsonnet
@@ -0,0 +1,40 @@
+local common = import 'common.libsonnet';
+
+local unix_funcs = [
+  'readdir',
+  //TODO: open_in, open_out, ...
+];
+
+local sys_funcs = [
+  'readdir',
+];
+
+{
+  rules: [
+    {
+      id: 'forbid-fs',
+      match: { any:
+        // Unix
+        [('Unix.' + p) for p in unix_funcs] +
+        [('UUnix.' + p) for p in unix_funcs] +
+        // Sys
+        [('Sys.' + p) for p in sys_funcs] +
+        [('USys.' + p) for p in sys_funcs] +
+        //TODO anything from UFile
+        //TODO Other libs?
+	[]
+      },
+      languages: ['ocaml'],
+      paths: {
+        exclude: common.exclude_paths +
+	['CapFS.ml', 'spacegrep/src/lib/Find_files.ml']
+      },
+      severity: 'ERROR',
+      message: |||
+        Do not use Unix or Sys filesystem functions. Use the
+        safer CapFS module.
+      |||,
+    },
+  ],
+
+}
diff --git cli/setup.py cli/setup.py
index c70da4e5cd02..df6cebd97baf 100644
--- cli/setup.py
+++ cli/setup.py
@@ -138,7 +138,7 @@ def find_executable(env_name, exec_name):
 
 setuptools.setup(
     name="semgrep",
-    version="1.107.0",
+    version="1.108.0",
     author="Semgrep Inc.",
     author_email="[email protected]",
     description="Lightweight static analysis for many languages. Find bug variants with patterns that look like source code.",
diff --git cli/src/semgrep/__init__.py cli/src/semgrep/__init__.py
index c06dd5f46d42..9b5d26233160 100644
--- cli/src/semgrep/__init__.py
+++ cli/src/semgrep/__init__.py
@@ -1 +1 @@
-__VERSION__ = "1.107.0"
+__VERSION__ = "1.108.0"
diff --git cli/src/semgrep/app/scans.py cli/src/semgrep/app/scans.py
index 537e491e9167..294f038cf38b 100644
--- cli/src/semgrep/app/scans.py
+++ cli/src/semgrep/app/scans.py
@@ -139,6 +139,15 @@ def resolve_all_deps_in_diff_scan(self) -> bool:
             return self.scan_response.engine_params.scan_all_deps_in_diff_scan
         return True
 
+    @property
+    def symbol_analysis(self) -> bool:
+        """
+        Collect symbol analysis in scan
+        """
+        if self.scan_response:
+            return self.scan_response.engine_params.symbol_analysis
+        return False
+
     @property
     def ptt_enabled(self) -> bool:
         """
diff --git cli/src/semgrep/commands/ci.py cli/src/semgrep/commands/ci.py
index 9570b7634ce6..744649864bf5 100644
--- cli/src/semgrep/commands/ci.py
+++ cli/src/semgrep/commands/ci.py
@@ -266,7 +266,9 @@ def ci(
     trace: bool,
     trace_endpoint: str,
     use_git_ignore: bool,
+    use_semgrepignore_v2: bool,
     verbose: bool,
+    x_tr: bool,
     path_sensitive: bool,
     allow_local_builds: bool,
     dump_n_rule_partitions: Optional[int],
@@ -666,12 +668,14 @@ def ci(
             "diff_depth": diff_depth,
             "capture_core_stderr": capture_core_stderr,
             "allow_local_builds": allow_local_builds,
+            "x_tr": x_tr,
             "dump_n_rule_partitions": dump_n_rule_partitions,
             "dump_rule_partitions_dir": dump_rule_partitions_dir,
             "ptt_enabled": scan_handler.ptt_enabled if scan_handler else False,
             "resolve_all_deps_in_diff_scan": scan_handler.resolve_all_deps_in_diff_scan
             if scan_handler
             else False,
+            "symbol_analysis": scan_handler.symbol_analysis if scan_handler else False,
         }
 
         try:
@@ -928,6 +932,24 @@ def ci(
                         else:
                             num_nonblocking_findings += 1
 
+            # Before we finish the scan, let's upload our symbol analysis if we have it.
+            # This is "scan-adjacent information", which is information we want to save,
+            # but doesn't really have to do with the meat of the scan (findings, etc).
+            # We upload it separately, and outsource to `osemgrep` so we don't duplicate
+            # the implementation.
+            if (
+                output_extra.core.symbol_analysis is not None
+                and scan_handler.scan_id
+                and token
+            ):
+                logger.debug(
+                    f"Attempting to upload symbol analysis of {len(output_extra.core.symbol_analysis.value)} symbols"
+                )
+                symbol_analysis = output_extra.core.symbol_analysis
+                semgrep.rpc_call.upload_symbol_analysis(
+                    token, scan_handler.scan_id, symbol_analysis
+                )
+
             if not internal_ci_scan_results:
                 output_handler.output(
                     non_cai_matches_by_rule,
diff --git cli/src/semgrep/commands/scan.py cli/src/semgrep/commands/scan.py
index 19918b09570d..3f6adc2efe52 100644
--- cli/src/semgrep/commands/scan.py
+++ cli/src/semgrep/commands/scan.py
@@ -87,6 +87,10 @@ def convert(
 METRICS_STATE_TYPE = MetricsStateType()
 
 # This subset of scan options is reused in ci.py
+# coupling: if you add an option below, you'll need to modify also the
+# list of parameters of scan() further below, of run_scan() in run_scan.py,
+# of ci() in ci.py and adjust run_sca_args in ci.py and the call to
+# semgrep.run_scan.run_scan() in this file.
 _scan_options: List[Callable] = [
     click.help_option("--help", "-h"),
     click.option(
@@ -107,6 +111,14 @@ def convert(
     optgroup.group(
         "Path options",
     ),
+    # temporary option, specific to pysemgrep. Will be removed
+    # once everyone is happy with Semgrepignore v2 (OCaml file targeting)
+    optgroup.option(
+        "--semgrepignore-v2/--no-semgrepignore-v2",
+        "use_semgrepignore_v2",
+        is_flag=True,
+        default=False,
+    ),
     optgroup.option(
         "--exclude",
         multiple=True,
@@ -383,6 +395,12 @@ def convert(
         is_flag=True,
         default=False,
     ),
+    optgroup.option(
+        "--x-tr",
+        "x_tr",
+        is_flag=True,
+        default=False,
+    ),
 ]
 
 
@@ -570,12 +588,14 @@ def scan(
     trace: bool,
     trace_endpoint: Optional[str],
     use_git_ignore: bool,
+    use_semgrepignore_v2: bool,
     validate: bool,
     verbose: bool,
     version: bool,
     x_ignore_semgrepignore_files: bool,
     x_ls: bool,
     x_ls_long: bool,
+    x_tr: bool,
     path_sensitive: bool,
     allow_local_builds: bool,
 ) -> Optional[Tuple[RuleMatchMap, List[SemgrepError], List[Rule], Set[Path]]]:
@@ -849,6 +869,7 @@ def scan(
                         baseline_commit=baseline_commit,
                         x_ls=x_ls,
                         x_ls_long=x_ls_long,
+                        x_tr=x_tr,
                         path_sensitive=path_sensitive,
                         capture_core_stderr=capture_core_stderr,
                         allow_local_builds=allow_local_builds,
diff --git cli/src/semgrep/core_runner.py cli/src/semgrep/core_runner.py
index eb4455b70e5e..5eee282e6cc5 100644
--- cli/src/semgrep/core_runner.py
+++ cli/src/semgrep/core_runner.py
@@ -520,6 +520,7 @@ def __init__(
         allow_untrusted_validators: bool,
         respect_rule_paths: bool = True,
         path_sensitive: bool = False,
+        symbol_analysis: bool = False,
     ):
         self._binary_path = engine_type.get_binary_path()
         self._jobs = jobs or engine_type.default_jobs
@@ -535,6 +536,7 @@ def __init__(
         self._path_sensitive = path_sensitive
         self._respect_rule_paths = respect_rule_paths
         self._capture_stderr = capture_stderr
+        self._symbol_analysis = symbol_analysis
 
     def _extract_core_output(
         self,
@@ -961,6 +963,12 @@ def _run_rules_direct_to_semgrep_core_helper(
             if self._path_sensitive:
                 cmd.append("-path_sensitive")
 
+            # This flag is only in the pro binary, so make sure we're pro
+            # More than that, `symbol_analysis` is only collectible on interfile
+            # scans. So let's only add it if that's the case.
+            if self._symbol_analysis and engine.is_interfile:
+                cmd.append("-symbol_analysis")
+
             # TODO: use exact same command-line arguments so just
             # need to replace the SemgrepCore.path() part.
             if engine.is_pro:
diff --git cli/src/semgrep/dependency_aware_rule.py cli/src/semgrep/dependency_aware_rule.py
index ade74db7d0a3..c2ec25a87e9a 100644
--- cli/src/semgrep/dependency_aware_rule.py
+++ cli/src/semgrep/dependency_aware_rule.py
@@ -10,6 +10,7 @@
 
 from attr import evolve
 
+import semgrep.rpc_call as rpc_call
 import semgrep.semgrep_interfaces.semgrep_output_v1 as out
 from semdep.external.packaging.specifiers import InvalidSpecifier  # type: ignore
 from semdep.external.packaging.specifiers import SpecifierSet  # type: ignore
@@ -68,13 +69,16 @@ def parse_depends_on_yaml(entries: List[Dict[str, str]]) -> Iterator[out.ScaPatt
         )
 
 
+# TODO: should be renamed undetermined_or_unreachable_...
 def generate_unreachable_sca_findings(
     rule: Rule,
     already_reachable: Callable[[Path, FoundDependency], bool],
     resolved_deps: Dict[Ecosystem, List[ResolvedSubproject]],
+    x_tr: bool,
 ) -> Tuple[List[RuleMatch], List[SemgrepError]]:
     """
-    Returns matches to a only a rule's sca-depends-on patterns; ignoring any reachabiliy patterns it has
+    Returns matches to only a rule's sca-depends-on patterns, ignoring any
+    reachability patterns it has
     """
     depends_on_keys = rule.project_depends_on
     dep_rule_errors: List[SemgrepError] = []
@@ -93,9 +97,10 @@ def generate_unreachable_sca_findings(
             )
             for dep_pat, found_dep in dependency_matches:
                 if found_dep.lockfile_path is None:
-                    # In rare cases, it's possible for a dependency to not have a lockfile
-                    # path. This indicates a dev error and usually means that the parser
-                    # did not associate the dep with a lockfile. So we'll just skip this dependency.
+                    # In rare cases, it's possible for a dependency to not have
+                    # a lockfile path. This indicates a dev error and usually
+                    # means that the parser did not associate the dep with a
+                    # lockfile. So we'll just skip this dependency.
                     logger.warning(
                         f"Found a dependency ({found_dep.package}) without a lockfile path. Skipping..."
                     )
@@ -103,6 +108,9 @@ def generate_unreachable_sca_findings(
 
                 lockfile_path = Path(found_dep.lockfile_path.value)
 
+                # For TR, even if we could find a reachable finding in the
+                # 1st-party code, we could also investigate the 3rd-party code,
+                # but let's KISS for now and just consider undetermined findings
                 if already_reachable(lockfile_path, found_dep):
                     continue
 
@@ -111,36 +119,39 @@ def generate_unreachable_sca_findings(
                     found_dependency=found_dep,
                     lockfile=out.Fpath(str(lockfile_path)),
                 )
+                sca_match = out.ScaMatch(
+                    sca_finding_schema=SCA_FINDING_SCHEMA,
+                    reachable=False,
+                    reachability_rule=rule.should_run_on_semgrep_core,
+                    dependency_match=dep_match,
+                )
+                core_match = out.CoreMatch(
+                    check_id=out.RuleId(rule.id),
+                    path=out.Fpath(str(lockfile_path)),
+                    start=out.Position(found_dep.line_number or 1, 1, 1),
+                    end=out.Position(
+                        (found_dep.line_number if found_dep.line_number else 1),
+                        1,
+                        1,
+                    ),
+                    # TODO: we need to define the fields below in
+                    # Output_from_core.atd so we can reuse out.MatchExtra
+                    extra=out.CoreMatchExtra(
+                        metavars=out.Metavars({}),
+                        engine_kind=out.EngineOfFinding(out.OSS()),
+                        is_ignored=False,
+                        sca_match=sca_match,
+                    ),
+                )
+
                 match = RuleMatch(
                     message=rule.message,
                     metadata=rule.metadata,
                     severity=rule.severity,
                     fix=None,
-                    match=out.CoreMatch(
-                        check_id=out.RuleId(rule.id),
-                        path=out.Fpath(str(lockfile_path)),
-                        start=out.Position(found_dep.line_number or 1, 1, 1),
-                        end=out.Position(
-                            (found_dep.line_number if found_dep.line_number else 1),
-                            1,
-                            1,
-                        ),
-                        # TODO: we need to define the fields below in
-                        # Output_from_core.atd so we can reuse out.MatchExtra
-                        extra=out.CoreMatchExtra(
-                            metavars=out.Metavars({}),
-                            engine_kind=out.EngineOfFinding(out.OSS()),
-                            is_ignored=False,
-                        ),
-                    ),
-                    extra={
-                        "sca_info": out.ScaMatch(
-                            sca_finding_schema=SCA_FINDING_SCHEMA,
-                            reachable=False,
-                            reachability_rule=rule.should_run_on_semgrep_core,
-                            dependency_match=dep_match,
-                        )
-                    },
+                    match=core_match,
+                    # TODO: remove, sca_info is now part of core_match
+                    extra={"sca_info": sca_match},
                 )
                 match = evolve(
                     match, match_based_index=match_based_keys[match.match_based_key]
@@ -148,6 +159,16 @@ def generate_unreachable_sca_findings(
                 match_based_keys[match.match_based_key] += 1
                 non_reachable_matches.append(match)
 
+    if x_tr:
+        logger.info("SCA TR is on!")
+        transitive_findings = [
+            out.TransitiveFinding(m=rm.match) for rm in non_reachable_matches
+        ]
+        res = rpc_call.transitive_reachability_filter(transitive_findings)
+        logger.info(f"TR result = {res}")
+        # TODO: result is ignored for now but we should reset the
+        # match field of non_reachable_matches and return them
+
     return non_reachable_matches, dep_rule_errors
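The refactor above builds `sca_match` and `core_match` once, then references the same `sca_match` object from both the core match's `extra` field and the legacy `extra={"sca_info": ...}` dict. A small sketch, using stand-in dataclasses rather than the real `out.*` types, of why sharing one object keeps the two views consistent:

```python
from dataclasses import dataclass, field
from typing import Any, Dict


@dataclass
class ScaMatch:  # stand-in for out.ScaMatch
    reachable: bool
    reachability_rule: bool


@dataclass
class CoreMatchExtra:  # stand-in for out.CoreMatchExtra
    sca_match: ScaMatch


@dataclass
class RuleMatch:  # stand-in for semgrep's RuleMatch
    extra_info: CoreMatchExtra
    extra: Dict[str, Any] = field(default_factory=dict)


sca = ScaMatch(reachable=False, reachability_rule=True)
match = RuleMatch(extra_info=CoreMatchExtra(sca_match=sca), extra={"sca_info": sca})

# Both fields alias the same object, so the legacy view cannot drift
# from the new one while the TODO to remove `sca_info` is outstanding:
assert match.extra["sca_info"] is match.extra_info.sca_match
```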
 
 
diff --git cli/src/semgrep/formatter/text.py cli/src/semgrep/formatter/text.py
index 2e3c09cc1f3b..5459a3e76855 100644
--- cli/src/semgrep/formatter/text.py
+++ cli/src/semgrep/formatter/text.py
@@ -894,7 +894,7 @@ def format(
                 # matches in the output.
                 # Instead, they are sent to the App for LLM validation. We expect this to
                 # be noisy, so we won't print out all of the findings here.
-                if group[1] == "generic-secrets":
+                if group[1] == "generic-secrets" and ctx.is_ci_invocation:
                     url = get_state().env.semgrep_url
                     console.print(
                         textwrap.dedent(
@@ -902,7 +902,7 @@ def format(
                         Your deployment has generic secrets enabled. {len(matches)} potential line locations
                         will be uploaded to the Semgrep platform and then analyzed by Semgrep Assistant.
                         Any findings that appear actionable will be available in the Semgrep Platform.
-                        You can view the secrets analyzed by Assistant at {url}/orgs/-/secrets?status=open&type=AI-detected+secret
+                        You can view the secrets analyzed by Assistant at {url}/orgs/-/secrets?status=open&type=AI-detected+secret+%28beta%29
                         """
                         )
                     )
diff --git cli/src/semgrep/git.py cli/src/semgrep/git.py
index 32f77c909854..cb293e2e3dda 100644
--- cli/src/semgrep/git.py
+++ cli/src/semgrep/git.py
@@ -31,7 +31,10 @@ def zsplit(s: str) -> List[str]:
         return []
 
 
-def git_check_output(command: Sequence[str], cwd: Optional[str] = None) -> str:
+def git_check_output(
+    command: Sequence[str],
+    cwd: Optional[str] = None,
+) -> str:
     """
     Helper function to run a GIT command that prints out helpful debugging information
     """
@@ -167,9 +170,10 @@ class BaselineHandler:
     """
     base_commit: Git ref to compare against
 
-    is_mergebase: Is it safe to assume that the given commit is the mergebase?
-    If not, we have to compute the mergebase ourselves, which can be impossible
-    on shallow checkouts.
+    is_mergebase: Is it safe to assume that the given commit is the merge base?
+    If not, we have to compute the merge base ourselves, which can be impossible
+    on shallow checkouts. A merge base is the most recent common ancestor
+    between two commits.
     """
 
     def __init__(self, base_commit: str, is_mergebase: bool = False) -> None:
@@ -191,6 +195,9 @@ def __init__(self, base_commit: str, is_mergebase: bool = False) -> None:
                 f"Error initializing baseline. While running command {e.cmd} received non-zero exit status of {e.returncode}.\n(stdout)->{e.stdout}\n(strerr)->{e.stderr}"
             )
 
+    def base_commit(self) -> str:
+        return self._base_commit
+
     def _get_git_status(self) -> GitStatus:
         """
         Read and parse git diff output to keep track of all status types
@@ -323,6 +330,33 @@ def _get_git_merge_base(self) -> str:
         else:
             return git_check_output(["git", "merge-base", self._base_commit, "HEAD"])
 
+    def _remove_worktree_with_check(self, worktree_dir: str) -> None:
+        # To help clean up a worktree in a `finally` clause
+        # In most cases, if `git worktree add` fails, we should get
+        # an error anyway, but there's no point in cleaning up a
+        # worktree that we know doesn't exist and this prevents us
+        # from failing if we get an unusual error
+        logger.debug("Checking that the worktree exists")
+        # nosemgrep: use-git-check-output-helper - we should continue when this fails
+        res = subprocess.run(["git", "worktree", "list"], capture_output=True)
+        list_stdout = res.stdout.decode() if res.stdout else "<No stdout>"
+        list_stderr = res.stderr.decode() if res.stderr else "<No stderr>"
+        if res.returncode != 0:
+            logger.debug(
+                f"Error running `git worktree list`:\n----stdout----\n{list_stdout}\n----stderr----\n{list_stderr}\n`git worktree list` is invoked via a subprocess; this should not be possible"
+            )
+        else:
+            if worktree_dir in list_stdout.strip():
+                logger.debug("Removing the worktree")
+                # nosemgrep: use-git-check-output-helper - we should continue when this fails
+                res = subprocess.run(["git", "worktree", "remove", worktree_dir], capture_output=True)
+                remove_stdout = res.stdout.decode() if res.stdout else "<No stdout>"
+                remove_stderr = res.stderr.decode() if res.stderr else "<No stderr>"
+                if res.returncode != 0:
+                    logger.debug(
+                        f"Error cleaning up the git worktree via `git worktree remove`:\n----stdout----\n{remove_stdout}\n----stderr----\n{remove_stderr}\n----git worktree list output----\n{list_stdout}"
+                    )
+
     @contextmanager
     def baseline_context(self) -> Iterator[None]:
         """
@@ -375,9 +409,9 @@ def baseline_context(self) -> Iterator[None]:
                 yield
             finally:
                 os.chdir(cwd)
-                logger.debug("Cleaning up git worktree")
-                # Remove the working tree
-                git_check_output(["git", "worktree", "remove", tmpdir])
+                # Clean up the worktree
+                logger.debug("Cleaning up the worktree")
+                self._remove_worktree_with_check(tmpdir)
                 logger.debug("Finished cleaning up git worktree")
 
     def print_git_log(self) -> None:
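The new `_remove_worktree_with_check` lets the `finally` clause clean up without raising when the worktree was never created: it consults `git worktree list` first and only then removes. A standalone sketch of that best-effort pattern (the function name and return value are illustrative, not the semgrep implementation):

```python
import subprocess


def remove_worktree_if_listed(worktree_dir: str) -> bool:
    """Best-effort cleanup: remove the worktree only if git still lists it."""
    try:
        # `git worktree list` prints one worktree per line, path first
        res = subprocess.run(
            ["git", "worktree", "list"], capture_output=True, text=True
        )
    except OSError:
        return False  # git itself is unavailable; nothing to clean up
    if res.returncode != 0 or worktree_dir not in res.stdout:
        # Listing failed or the worktree is already gone: log-and-continue
        # territory, never an exception from a finally block
        return False
    rm = subprocess.run(
        ["git", "worktree", "remove", worktree_dir], capture_output=True, text=True
    )
    return rm.returncode == 0


# A directory git never created is simply skipped, without raising:
assert remove_worktree_if_listed("/tmp/definitely-not-a-worktree") is False
```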
diff --git cli/src/semgrep/output.py cli/src/semgrep/output.py
index 1250836027b2..305e0b9900bc 100644
--- cli/src/semgrep/output.py
+++ cli/src/semgrep/output.py
@@ -413,7 +413,11 @@ def output(
         else:
             # ignore log was not created, so the run failed before it even started
             # create a fake log to track the errors
-            self.ignore_log = FileTargetingLog(TargetManager(frozenset([Path(".")])))
+            self.ignore_log = FileTargetingLog(
+                TargetManager(
+                    scanning_root_strings=frozenset([Path(".")]),
+                )
+            )
 
         if extra:
             self.extra = extra
diff --git cli/src/semgrep/resolve_dependency_source.py cli/src/semgrep/resolve_dependency_source.py
index ad5bbb2a63d6..5058e09f3f52 100644
--- cli/src/semgrep/resolve_dependency_source.py
+++ cli/src/semgrep/resolve_dependency_source.py
@@ -85,6 +85,14 @@
     (out.ManifestKind(out.BuildGradle()), None),
     (out.ManifestKind(out.BuildGradle()), out.LockfileKind(out.GradleLockfile())),
     (out.ManifestKind(out.Csproj()), None),
+    (
+        out.ManifestKind(out.RequirementsIn()),
+        out.LockfileKind(out.PipRequirementsTxt()),
+    ),
+    (
+        None,
+        out.LockfileKind(out.PipRequirementsTxt()),
+    ),
 ]
 
 DependencyResolutionResult = Tuple[
@@ -108,7 +116,12 @@ def _resolve_dependencies_rpc(
     """
     Handle the RPC call to resolve dependencies in ocaml
     """
-    response = resolve_dependencies([dependency_source.to_semgrep_output()])
+    try:
+        response = resolve_dependencies([dependency_source.to_semgrep_output()])
+    except Exception as e:
+        logger.verbose(f"RPC call failed: {e}")
+        return None, [], []
+
     if response is None:
         # we failed to resolve somehow
         # TODO: handle this and generate an error
@@ -191,6 +204,8 @@ def _handle_manifest_only_source(
 
 def _handle_multi_lockfile_source(
     dep_source: MultiLockfileDependencySource,
+    enable_dynamic_resolution: bool,
+    ptt_enabled: bool,
 ) -> DependencyResolutionResult:
     """Handle dependency resolution for sources with multiple lockfiles."""
     all_resolved_deps: List[FoundDependency] = []
@@ -200,8 +215,16 @@ def _handle_multi_lockfile_source(
     resolution_methods: Set[ResolutionMethod] = set()
 
     for lockfile_source in dep_source.sources:
+        # We resolve each lockfile source independently.
+        #
+        # NOTE(sal): In the case of dynamic resolution, we should try to resolve all the lockfiles together,
+        #            and then get a single response for all of them. Until then, I explicitly disable
+        #            dynamic resolution and path-to-transitivity (PTT) for multi-lockfile sources. They were
+        #            never enabled in the first place anyway.
         new_resolved_info, new_errors, new_targets = resolve_dependency_source(
-            lockfile_source
+            lockfile_source,
+            enable_dynamic_resolution=False,
+            ptt_enabled=False,
         )
         if new_resolved_info is not None:
             resolution_method, new_deps = new_resolved_info
@@ -253,12 +276,19 @@ def _handle_lockfile_source(
         )
 
         if use_nondynamic_ocaml_parsing or use_dynamic_resolution:
+            logger.verbose(
+                f"Dynamically resolving path(s): {[str(path) for path in dep_source.get_display_paths()]}"
+            )
+
             (
                 new_deps,
                 new_errors,
                 new_targets,
             ) = _resolve_dependencies_rpc(dep_source)
 
+            for error in new_errors:
+                logger.verbose(f"Dynamic resolution RPC error: '{error}'")
+
             if new_deps is not None:
                 # TODO: Reimplement this once more robust error handling for lockfileless resolution is implemented
                 return (
@@ -312,7 +342,11 @@ def resolve_dependency_source(
             ptt_enabled,
         )
     elif isinstance(dep_source, MultiLockfileDependencySource):
-        return _handle_multi_lockfile_source(dep_source)
+        return _handle_multi_lockfile_source(
+            dep_source,
+            enable_dynamic_resolution,
+            ptt_enabled,
+        )
     elif (
         isinstance(dep_source, ManifestOnlyDependencySource)
         and enable_dynamic_resolution
diff --git cli/src/semgrep/resolve_subprojects.py cli/src/semgrep/resolve_subprojects.py
index 1a71001190dd..50775e332617 100644
--- cli/src/semgrep/resolve_subprojects.py
+++ cli/src/semgrep/resolve_subprojects.py
@@ -99,7 +99,7 @@ def filter_changed_subprojects(
     # note that this logic re-implements the logic in `dependency_aware_rule.py`
     for language, ecosystems in ecosystems_by_language.items():
         for code_file in target_manager.get_files_for_language(
-            language, out.Product
+            lang=language, product=out.Product
         ).kept:
             # there may be multiple ecosystems for a single language, and the finding-generation
             # logic will find a different closest subproject for each one. So we need to mark
diff --git cli/src/semgrep/rpc_call.py cli/src/semgrep/rpc_call.py
index f43a887f0b19..5d7b5866f4dc 100644
--- cli/src/semgrep/rpc_call.py
+++ cli/src/semgrep/rpc_call.py
@@ -76,6 +76,37 @@ def resolve_dependencies(
     return ret.value
 
 
+def upload_symbol_analysis(
+    token: str, scan_id: int, symbol_analysis: out.SymbolAnalysis
+) -> None:
+    call = out.FunctionCall(
+        out.CallUploadSymbolAnalysis((token, scan_id, symbol_analysis))
+    )
+    ret: Optional[out.RetUploadSymbolAnalysis] = rpc_call(
+        call, out.RetUploadSymbolAnalysis
+    )
+    if ret is None:
+        logger.warning(
+            "Failed to upload symbol analysis, somehow. Continuing with scan..."
+        )
+    else:
+        logger.debug(f"Uploading symbol analysis succeeded with {ret.value}")
+
+
+def transitive_reachability_filter(
+    args: List[out.TransitiveFinding],
+) -> List[out.TransitiveFinding]:
+    call = out.FunctionCall(out.CallTransitiveReachabilityFilter(args))
+    ret: Optional[out.RetTransitiveReachabilityFilter] = rpc_call(
+        call, out.RetTransitiveReachabilityFilter
+    )
+    if ret is None:
+        logger.warning("failed to filter transitive findings")
+        # return the same findings
+        return args
+    return ret.value
+
+
 def dump_rule_partitions(args: out.DumpRulePartitionsParams) -> bool:
     call = out.FunctionCall(out.CallDumpRulePartitions(args))
     ret: Optional[out.RetDumpRulePartitions] = rpc_call(call, out.RetDumpRulePartitions)
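`transitive_reachability_filter` above fails open: when the RPC returns nothing, it hands back its input unchanged rather than dropping findings. The shape of that pattern, generalized with illustrative names:

```python
from typing import Callable, List, Optional, TypeVar

T = TypeVar("T")


def fail_open_filter(
    items: List[T], rpc: Callable[[List[T]], Optional[List[T]]]
) -> List[T]:
    """Apply an RPC-backed filter, keeping everything if the RPC fails."""
    result = rpc(items)
    if result is None:
        # Fail open: better to over-report findings than to silently lose them
        return items
    return result


assert fail_open_filter([1, 2, 3], lambda xs: None) == [1, 2, 3]
assert fail_open_filter([1, 2, 3], lambda xs: [x for x in xs if x > 1]) == [2, 3]
```

The opposite choice, failing closed, would make an RPC outage look like a clean scan, which is the wrong default for a security tool.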
diff --git cli/src/semgrep/run_scan.py cli/src/semgrep/run_scan.py
index db4d7479e76c..dc13aad4992b 100644
--- cli/src/semgrep/run_scan.py
+++ cli/src/semgrep/run_scan.py
@@ -252,6 +252,7 @@ def run_rules(
     allow_local_builds: bool = False,
     ptt_enabled: bool = False,
     resolve_all_deps_in_diff_scan: bool = False,
+    x_tr: bool = False,
 ) -> Tuple[
     RuleMatchMap,
     List[SemgrepError],
@@ -385,9 +386,12 @@ def run_rules(
         for rule in dependency_aware_rules:
             if rule.should_run_on_semgrep_core:
                 # If we have a reachability rule (contains a pattern)
-                # First we check if each match has a lockfile with the correct vulnerability and turn these into SCA findings
-                # Then we generate unreachable findings in all the remaining targeted lockfiles
-                # For each rule, we do not want to generate an unreachable finding in a lockfile
+                # First we check if each match has a lockfile with the correct
+                # vulnerability and turn these into SCA findings.
+                # Then we generate unreachable findings in all the remaining
+                # targeted lockfiles.
+                # For each rule, we do not want to generate an unreachable
+                # finding in a lockfile
                 # that already has a reachable finding, so we exclude them
                 (
                     dep_rule_matches,
@@ -398,6 +402,7 @@ def run_rules(
                     rule,
                     resolved_subprojects,
                 )
+
                 rule_matches_by_rule[rule] = dep_rule_matches
                 output_handler.handle_semgrep_errors(dep_rule_errors)
                 (
@@ -407,6 +412,7 @@ def run_rules(
                     rule,
                     already_reachable,
                     resolved_subprojects,
+                    x_tr=x_tr,
                 )
                 rule_matches_by_rule[rule].extend(dep_rule_matches)
                 output_handler.handle_semgrep_errors(dep_rule_errors)
@@ -415,12 +421,13 @@ def run_rules(
                     dep_rule_matches,
                     dep_rule_errors,
                 ) = generate_unreachable_sca_findings(
-                    rule, lambda p, d: False, resolved_subprojects
+                    rule, lambda p, d: False, resolved_subprojects, x_tr=False
                 )
                 rule_matches_by_rule[rule] = dep_rule_matches
                 output_handler.handle_semgrep_errors(dep_rule_errors)
 
-        # The caller expects a map from lockfile path to `FoundDependency` items rather than our Subproject representation
+        # The caller expects a map from lockfile path to `FoundDependency` items
+        # rather than our Subproject representation
         deps_by_lockfile: Dict[str, List[FoundDependency]] = {}
         for ecosystem in resolved_subprojects:
             for proj in resolved_subprojects[ecosystem]:
@@ -465,7 +472,7 @@ def run_rules(
 def list_targets_and_exit(
     target_manager: TargetManager, product: out.Product, long_format: bool = False
 ) -> None:
-    targets = target_manager.get_files_for_language(None, product)
+    targets = target_manager.get_files_for_language(lang=None, product=product)
     for path in sorted(targets.kept):
         if long_format:
             print(f"selected {path}")
@@ -530,6 +537,7 @@ def run_scan(
     baseline_commit_is_mergebase: bool = False,
     x_ls: bool = False,
     x_ls_long: bool = False,
+    x_tr: bool = False,
     path_sensitive: bool = False,
     capture_core_stderr: bool = True,
     allow_local_builds: bool = False,
@@ -537,6 +545,7 @@ def run_scan(
     dump_rule_partitions_dir: Optional[Path] = None,
     ptt_enabled: bool = False,
     resolve_all_deps_in_diff_scan: bool = False,
+    symbol_analysis: bool = False,
 ) -> Tuple[
     RuleMatchMap,
     List[SemgrepError],
@@ -769,6 +778,7 @@ def run_scan(
         allow_untrusted_validators=allow_untrusted_validators,
         respect_rule_paths=respect_rule_paths,
         path_sensitive=path_sensitive,
+        symbol_analysis=symbol_analysis,
     )
 
     experimental_rules, normal_rules = partition(
@@ -817,6 +827,7 @@ def run_scan(
         allow_local_builds=allow_local_builds,
         ptt_enabled=ptt_enabled,
         resolve_all_deps_in_diff_scan=resolve_all_deps_in_diff_scan,
+        x_tr=x_tr,
     )
     profiler.save("core_time", core_start_time)
     semgrep_errors: List[SemgrepError] = config_errors + scan_errors
diff --git cli/src/semgrep/semgrep_interfaces cli/src/semgrep/semgrep_interfaces
index 5e0c767ec323..56950a497ea4 160000
--- cli/src/semgrep/semgrep_interfaces
+++ cli/src/semgrep/semgrep_interfaces
@@ -1 +1 @@
-Subproject commit 5e0c767ec323f3f2356d3bf8dbdf7c7836497d8a
+Subproject commit 56950a497ea471d17b0559408342a720c058da3e
diff --git cli/src/semgrep/target_manager.py cli/src/semgrep/target_manager.py
index e4ce3350a96d..98cd89bdd333 100644
--- cli/src/semgrep/target_manager.py
+++ cli/src/semgrep/target_manager.py
@@ -525,7 +525,7 @@ def files_from_filesystem(self) -> Tuple[FrozenSet[Path], FrozenSet[Path]]:
         return (regular_files, insufficient_permissions)
 
     @lru_cache(maxsize=None)
-    def _files(
+    def _target_files(
         self, ignore_baseline_handler: bool = False
     ) -> Tuple[FrozenSet[Path], FrozenSet[Path]]:
         """
@@ -560,24 +560,25 @@ def _files(
 
         return self.files_from_filesystem()
 
-    # cached (see _files())
-    def files(self, ignore_baseline_handler: bool = False) -> FrozenSet[Path]:
-        selected, _insufficient_permissions = self._files(
+    # cached (see _target_files())
+    def target_files(self, ignore_baseline_handler: bool = False) -> FrozenSet[Path]:
+        """Discover target files from the scanning root and cache the result"""
+        selected, _insufficient_permissions = self._target_files(
             ignore_baseline_handler=ignore_baseline_handler
         )
         return selected
 
-    # cached (see _files())
+    # cached (see _target_files())
     def paths_with_insufficient_permissions(
         self, ignore_baseline_handler: bool = False
     ) -> FrozenSet[Path]:
-        _selected, insufficient_permissions = self._files(
+        _selected, insufficient_permissions = self._target_files(
             ignore_baseline_handler=ignore_baseline_handler
         )
         return insufficient_permissions
 
 
-@define(eq=False)
+@define(eq=False, kw_only=True)
 class TargetManager:
     """
     Handles all file include/exclude logic for semgrep
@@ -787,7 +788,7 @@ def get_all_files(self, ignore_baseline_handler: bool = False) -> FrozenSet[Path
         return frozenset(
             f
             for root in self.scanning_roots
-            for f in root.files(ignore_baseline_handler)
+            for f in root.target_files(ignore_baseline_handler)
         )
 
     @lru_cache(maxsize=None)
@@ -808,6 +809,7 @@ def get_paths_with_insufficient_permissions(
     @lru_cache(maxsize=None)
     def get_files_for_language(
         self,
+        *,
         lang: Union[None, Language, Literal["dependency_source_files"]],
         product: out.Product,
         ignore_baseline_handler: bool = False,
@@ -920,7 +922,7 @@ def get_files_for_rule(
         in SCANNING_ROOT will bypass this global INCLUDE/EXCLUDE filter. The local INCLUDE/EXCLUDE
         filter is then applied.
         """
-        paths = self.get_files_for_language(lang, rule_product)
+        paths = self.get_files_for_language(lang=lang, product=rule_product)
 
         if self.respect_rule_paths:
             paths = self.filter_includes(rule_includes, candidates=paths.kept)
@@ -939,6 +941,8 @@ def get_all_dependency_source_files(
         Return all files that might be used as a source of dependency information
         """
         all_files = self.get_files_for_language(
-            "dependency_source_files", out.Product(out.SCA()), ignore_baseline_handler
+            lang="dependency_source_files",
+            product=out.Product(out.SCA()),
+            ignore_baseline_handler=ignore_baseline_handler,
         )
         return all_files.kept
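`get_files_for_language` now takes keyword-only parameters (the bare `*` in its signature), and `TargetManager` is declared with `kw_only=True`; both force call sites to name their arguments, which is exactly what the updated calls like `get_files_for_language(lang=..., product=...)` do. A minimal sketch of the mechanism using a plain function (the signature is illustrative):

```python
def get_files_for_language(*, lang, product, ignore_baseline_handler=False):
    """Everything after the bare `*` must be passed by keyword."""
    return (lang, product, ignore_baseline_handler)


# Named arguments can no longer be swapped by accident:
assert get_files_for_language(lang="python", product="sca") == ("python", "sca", False)

# Positional calls are rejected outright:
try:
    get_files_for_language("python", "sca")
    raised = False
except TypeError:
    raised = True
assert raised
```

This is a cheap way to make a signature change (like inserting a new parameter) safe: every existing caller either names its arguments correctly or fails loudly with a `TypeError` instead of silently binding values to the wrong parameters.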
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_dryrun/None/results.txt cli/tests/default/e2e-other/snapshots/test_ci/test_dryrun/None/results.txt
index ef3c68e7b684..54475d7f6ce9 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_dryrun/None/results.txt
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_dryrun/None/results.txt
@@ -212,7 +212,6 @@ Would have sent findings and ignores blob: {
                 "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
             },
             "sca_info": {
-                "reachable": false,
                 "reachability_rule": false,
                 "sca_finding_schema": 20220913,
                 "dependency_match": {
@@ -232,7 +231,8 @@ Would have sent findings and ignores blob: {
                         "children": []
                     },
                     "lockfile": "poetry.lock"
-                }
+                },
+                "reachable": false
             },
             "engine_kind": "OSS"
         },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-azure-pipelines-overwrite-autodetected-variables/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-azure-pipelines-overwrite-autodetected-variables/findings_and_ignores.json
index 2b24c25652db..38e2a22d3222 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-azure-pipelines-overwrite-autodetected-variables/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-azure-pipelines-overwrite-autodetected-variables/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-azure-pipelines/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-azure-pipelines/findings_and_ignores.json
index 2b24c25652db..38e2a22d3222 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-azure-pipelines/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-azure-pipelines/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-bitbucket-overwrite-autodetected-variables/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-bitbucket-overwrite-autodetected-variables/findings_and_ignores.json
index 2b24c25652db..38e2a22d3222 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-bitbucket-overwrite-autodetected-variables/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-bitbucket-overwrite-autodetected-variables/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-bitbucket/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-bitbucket/findings_and_ignores.json
index 2b24c25652db..38e2a22d3222 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-bitbucket/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-bitbucket/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-buildkite-overwrite-autodetected-variables/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-buildkite-overwrite-autodetected-variables/findings_and_ignores.json
index 2b24c25652db..38e2a22d3222 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-buildkite-overwrite-autodetected-variables/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-buildkite-overwrite-autodetected-variables/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-buildkite/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-buildkite/findings_and_ignores.json
index 2b24c25652db..38e2a22d3222 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-buildkite/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-buildkite/findings_and_ignores.json
@@ -122,7 +122,6 @@
        "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-circleci-overwrite-autodetected-variables/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-circleci-overwrite-autodetected-variables/findings_and_ignores.json
index 2b24c25652db..38e2a22d3222 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-circleci-overwrite-autodetected-variables/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-circleci-overwrite-autodetected-variables/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-circleci/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-circleci/findings_and_ignores.json
index 2b24c25652db..38e2a22d3222 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-circleci/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-circleci/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-github-enterprise/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-github-enterprise/findings_and_ignores.json
index 2b24c25652db..38e2a22d3222 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-github-enterprise/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-github-enterprise/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-github-push-special-env-vars/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-github-push-special-env-vars/findings_and_ignores.json
index 2b24c25652db..38e2a22d3222 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-github-push-special-env-vars/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-github-push-special-env-vars/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-github-push-with-app-url/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-github-push-with-app-url/findings_and_ignores.json
index 2b24c25652db..38e2a22d3222 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-github-push-with-app-url/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-github-push-with-app-url/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-github-push/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-github-push/findings_and_ignores.json
index 2b24c25652db..38e2a22d3222 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-github-push/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-github-push/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-gitlab-push/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-gitlab-push/findings_and_ignores.json
index 2b24c25652db..38e2a22d3222 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-gitlab-push/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-gitlab-push/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-jenkins-missing-vars/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-jenkins-missing-vars/findings_and_ignores.json
index 2b24c25652db..38e2a22d3222 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-jenkins-missing-vars/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-jenkins-missing-vars/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-jenkins-overwrite-autodetected-variables/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-jenkins-overwrite-autodetected-variables/findings_and_ignores.json
index 2b24c25652db..38e2a22d3222 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-jenkins-overwrite-autodetected-variables/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-jenkins-overwrite-autodetected-variables/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-jenkins/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-jenkins/findings_and_ignores.json
index 2b24c25652db..38e2a22d3222 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-jenkins/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-jenkins/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-local/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-local/findings_and_ignores.json
index 2b24c25652db..38e2a22d3222 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-local/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-local/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-self-hosted/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-self-hosted/findings_and_ignores.json
index 2b24c25652db..38e2a22d3222 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-self-hosted/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-self-hosted/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-travis-overwrite-autodetected-variables/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-travis-overwrite-autodetected-variables/findings_and_ignores.json
index 2b24c25652db..38e2a22d3222 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-travis-overwrite-autodetected-variables/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-travis-overwrite-autodetected-variables/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-travis/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-travis/findings_and_ignores.json
index 2b24c25652db..38e2a22d3222 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-travis/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-travis/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-unparsable_url/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-unparsable_url/findings_and_ignores.json
index 2b24c25652db..38e2a22d3222 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-unparsable_url/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/autofix-unparsable_url/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-azure-pipelines-overwrite-autodetected-variables/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-azure-pipelines-overwrite-autodetected-variables/findings_and_ignores.json
index 9c55dac1a286..898918de7fb0 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-azure-pipelines-overwrite-autodetected-variables/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-azure-pipelines-overwrite-autodetected-variables/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-azure-pipelines/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-azure-pipelines/findings_and_ignores.json
index 9c55dac1a286..898918de7fb0 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-azure-pipelines/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-azure-pipelines/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-bitbucket-overwrite-autodetected-variables/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-bitbucket-overwrite-autodetected-variables/findings_and_ignores.json
index 9c55dac1a286..898918de7fb0 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-bitbucket-overwrite-autodetected-variables/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-bitbucket-overwrite-autodetected-variables/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-bitbucket/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-bitbucket/findings_and_ignores.json
index 9c55dac1a286..898918de7fb0 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-bitbucket/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-bitbucket/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-buildkite-overwrite-autodetected-variables/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-buildkite-overwrite-autodetected-variables/findings_and_ignores.json
index 9c55dac1a286..898918de7fb0 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-buildkite-overwrite-autodetected-variables/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-buildkite-overwrite-autodetected-variables/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-buildkite/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-buildkite/findings_and_ignores.json
index 9c55dac1a286..898918de7fb0 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-buildkite/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-buildkite/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-circleci-overwrite-autodetected-variables/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-circleci-overwrite-autodetected-variables/findings_and_ignores.json
index 9c55dac1a286..898918de7fb0 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-circleci-overwrite-autodetected-variables/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-circleci-overwrite-autodetected-variables/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-circleci/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-circleci/findings_and_ignores.json
index 9c55dac1a286..898918de7fb0 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-circleci/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-circleci/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-github-enterprise/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-github-enterprise/findings_and_ignores.json
index 9c55dac1a286..898918de7fb0 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-github-enterprise/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-github-enterprise/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-github-push-special-env-vars/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-github-push-special-env-vars/findings_and_ignores.json
index 9c55dac1a286..898918de7fb0 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-github-push-special-env-vars/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-github-push-special-env-vars/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-github-push-with-app-url/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-github-push-with-app-url/findings_and_ignores.json
index 9c55dac1a286..898918de7fb0 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-github-push-with-app-url/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-github-push-with-app-url/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-github-push/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-github-push/findings_and_ignores.json
index 9c55dac1a286..898918de7fb0 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-github-push/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-github-push/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-gitlab-push/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-gitlab-push/findings_and_ignores.json
index 9c55dac1a286..898918de7fb0 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-gitlab-push/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-gitlab-push/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-jenkins-missing-vars/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-jenkins-missing-vars/findings_and_ignores.json
index 9c55dac1a286..898918de7fb0 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-jenkins-missing-vars/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-jenkins-missing-vars/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-jenkins-overwrite-autodetected-variables/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-jenkins-overwrite-autodetected-variables/findings_and_ignores.json
index 9c55dac1a286..898918de7fb0 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-jenkins-overwrite-autodetected-variables/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-jenkins-overwrite-autodetected-variables/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-jenkins/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-jenkins/findings_and_ignores.json
index 9c55dac1a286..898918de7fb0 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-jenkins/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-jenkins/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-local/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-local/findings_and_ignores.json
index 9c55dac1a286..898918de7fb0 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-local/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-local/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-self-hosted/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-self-hosted/findings_and_ignores.json
index 9c55dac1a286..898918de7fb0 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-self-hosted/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-self-hosted/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-travis-overwrite-autodetected-variables/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-travis-overwrite-autodetected-variables/findings_and_ignores.json
index 9c55dac1a286..898918de7fb0 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-travis-overwrite-autodetected-variables/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-travis-overwrite-autodetected-variables/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-travis/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-travis/findings_and_ignores.json
index 9c55dac1a286..898918de7fb0 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-travis/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-travis/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-unparsable_url/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-unparsable_url/findings_and_ignores.json
index 9c55dac1a286..898918de7fb0 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-unparsable_url/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_full_run/noautofix-unparsable_url/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_generic_secrets_output/generic_secrets_and_real_rule/output.txt cli/tests/default/e2e-other/snapshots/test_ci/test_generic_secrets_output/generic_secrets_and_real_rule/output.txt
index c19542363604..886f30e036c7 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_generic_secrets_output/generic_secrets_and_real_rule/output.txt
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_generic_secrets_output/generic_secrets_and_real_rule/output.txt
@@ -101,7 +101,7 @@
   will be uploaded to the Semgrep platform and then analyzed by Semgrep Assistant.
   Any findings that appear actionable will be available in the Semgrep Platform.
   You can view the secrets analyzed by Assistant at
-  https://semgrep.dev/orgs/-/secrets?status=open&type=AI-detected+secret
+  https://semgrep.dev/orgs/-/secrets?status=open&type=AI-detected+secret+%28beta%29
 
 
   BLOCKING CODE RULES FIRED:
diff --git cli/tests/default/e2e-other/snapshots/test_ci/test_lockfile_parse_failure_reporting/findings_and_ignores.json cli/tests/default/e2e-other/snapshots/test_ci/test_lockfile_parse_failure_reporting/findings_and_ignores.json
index 33c92e0f9a33..544f8dc27f64 100644
--- cli/tests/default/e2e-other/snapshots/test_ci/test_lockfile_parse_failure_reporting/findings_and_ignores.json
+++ cli/tests/default/e2e-other/snapshots/test_ci/test_lockfile_parse_failure_reporting/findings_and_ignores.json
@@ -122,7 +122,6 @@
         "pattern_hash": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
       },
       "sca_info": {
-        "reachable": false,
         "reachability_rule": false,
         "sca_finding_schema": 20220913,
         "dependency_match": {
@@ -142,7 +141,8 @@
             "children": []
           },
           "lockfile": "poetry.lock"
-        }
+        },
+        "reachable": false
       },
       "engine_kind": "OSS"
     },
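The snapshot churn in the hunks above is purely a key-order move: `reachable` relocates within the `sca_info` object without changing its value. Text-level snapshot diffs surface such moves, while a semantic JSON comparison does not, because Python dict equality ignores key order. A minimal sketch of that distinction (standalone illustration, not part of the test suite):

```python
import json

# Two serializations of the same sca_info object, differing only in
# where the "reachable" key appears -- as in the snapshot diffs above.
before = '{"sca_info": {"reachable": false, "reachability_rule": false}}'
after = '{"sca_info": {"reachability_rule": false, "reachable": false}}'

# The raw strings differ, so a text snapshot comparison reports a change...
assert before != after

# ...but json.loads yields dicts, and dict equality is order-insensitive,
# so the two snapshots are semantically identical.
assert json.loads(before) == json.loads(after)
```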
diff --git a/cli/tests/default/e2e/snapshots/test_baseline/test_worktree_state_restored/output.txt b/cli/tests/default/e2e/snapshots/test_baseline/test_worktree_state_restored/output.txt
new file mode 100644
index 000000000000..6882525b385a
--- /dev/null
+++ cli/tests/default/e2e/snapshots/test_baseline/test_worktree_state_restored/output.txt
@@ -0,0 +1,9 @@
+
+
+┌────────────────┐
+│ 1 Code Finding │
+└────────────────┘
+
+    foo.py
+            1┆ x = 23478921
+
diff --git cli/tests/default/e2e/snapshots/test_metavariable_pattern/test1/results.json cli/tests/default/e2e/snapshots/test_metavariable_pattern/test1/results.json
index ed743e48d3b3..ce5f74358848 100644
--- cli/tests/default/e2e/snapshots/test_metavariable_pattern/test1/results.json
+++ cli/tests/default/e2e/snapshots/test_metavariable_pattern/test1/results.json
@@ -46,14 +46,6 @@
             "https://docs.github.com/en/actions/learn-github-actions/security-hardening-for-github-actions#understanding-the-risk-of-script-injections",
             "https://securitylab.github.com/research/github-actions-untrusted-input/"
           ],
-          "semgrep.dev": {
-            "rule": {
-              "origin": "community",
-              "rule_id": "v8UjQj",
-              "url": "https://semgrep.dev/playground/r/K3TPZA/yaml.github-actions.security.run-shell-injection.run-shell-injection",
-              "version_id": "K3TPZA"
-            }
-          },
           "shortlink": "https://sg.run/pkzk",
           "source": "https://semgrep.dev/r/yaml.github-actions.security.run-shell-injection.run-shell-injection",
           "subcategory": [
diff --git cli/tests/default/e2e/snapshots/test_misc/test_deduplication/results.json cli/tests/default/e2e/snapshots/test_misc/test_deduplication/results.json
index b9447f6acaf2..c8045a138a5e 100644
--- cli/tests/default/e2e/snapshots/test_misc/test_deduplication/results.json
+++ cli/tests/default/e2e/snapshots/test_misc/test_deduplication/results.json
@@ -29,11 +29,6 @@
           "cwe": "CWE-522: Insufficiently Protected Credentials",
           "license": "Commons Clause License Condition v1.0[LGPL-2.1-only]",
           "owasp": "A2: Broken Authentication",
-          "semgrep.policy": {
-            "id": 18613,
-            "name": "Rule Board - Block column",
-            "slug": "rule-board-block"
-          },
           "semgrep.ruleset": "ci",
           "semgrep.ruleset_id": 735,
           "semgrep.url": "https://semgrep.dev/r/javascript.jsonwebtoken.security.jwt-hardcode.hardcoded-jwt-secret",
@@ -77,11 +72,6 @@
           "cwe": "CWE-522: Insufficiently Protected Credentials",
           "license": "Commons Clause License Condition v1.0[LGPL-2.1-only]",
           "owasp": "A2: Broken Authentication",
-          "semgrep.policy": {
-            "id": 18613,
-            "name": "Rule Board - Block column",
-            "slug": "rule-board-block"
-          },
           "semgrep.ruleset": "ci",
           "semgrep.ruleset_id": 735,
           "semgrep.url": "https://semgrep.dev/r/javascript.jsonwebtoken.security.jwt-hardcode.hardcoded-jwt-secret",
@@ -125,11 +115,6 @@
           "cwe": "CWE-522: Insufficiently Protected Credentials",
           "license": "Commons Clause License Condition v1.0[LGPL-2.1-only]",
           "owasp": "A2: Broken Authentication",
-          "semgrep.policy": {
-            "id": 18613,
-            "name": "Rule Board - Block column",
-            "slug": "rule-board-block"
-          },
           "semgrep.ruleset": "ci",
           "semgrep.ruleset_id": 735,
           "semgrep.url": "https://semgrep.dev/r/javascript.jsonwebtoken.security.jwt-hardcode.hardcoded-jwt-secret",
diff --git cli/tests/default/e2e/test_baseline.py cli/tests/default/e2e/test_baseline.py
index 04ef11c97a8e..a99e32908918 100644
--- cli/tests/default/e2e/test_baseline.py
+++ cli/tests/default/e2e/test_baseline.py
@@ -150,6 +150,45 @@ def assert_err_match(snapshot, output, snapshot_name, replace_base_commit=None):
     return snapshot.assert_match(textwrap.dedent(err), snapshot_name)
 
 
+@pytest.mark.osemfail
+def test_worktree_state_restored(git_tmp_path, snapshot):
+    # Test when head contains all findings and baseline doesnt contain any
+    foo = git_tmp_path / "foo.py"
+    foo.write_text(f"x = 1")
+
+    # Add baseline files
+    subprocess.run(["git", "add", "."], check=True, capture_output=True)
+    _git_commit(1)
+    base_commit = subprocess.check_output(
+        ["git", "rev-parse", "HEAD"], encoding="utf-8"
+    ).strip()
+
+    # Write a finding
+    foo.write_text(f"x = {SENTINEL_1}\n")
+    subprocess.run(["git", "add", "."], check=True, capture_output=True)
+    _git_commit(2)
+
+    # Save the git worktree state to ensure we clean it up
+    # Note: This test has been arbitrarily chosen to check that the
+    # git worktree state is cleaned up
+    worktree_state_before = subprocess.run(
+        ["git", "worktree", "list"], capture_output=True
+    )
+
+    # Run non-baseline scan and sanity check findings
+    output = run_sentinel_scan()
+    assert_out_match(snapshot, output, "output.txt")
+
+    # Run baseline scan
+    run_sentinel_scan(base_commit=base_commit)
+
+    # Check that the git worktree state was not changed by the scans
+    worktree_state_after = subprocess.run(
+        ["git", "worktree", "list"], capture_output=True
+    )
+    assert worktree_state_before.stdout == worktree_state_after.stdout
+
+
 def test_one_commit_with_baseline(git_tmp_path, snapshot):
     foo = git_tmp_path / "foo.py"
     foo.write_text(f"x = {SENTINEL_1}\n")
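The new `test_worktree_state_restored` test above captures `git worktree list` output before and after the scans and asserts it is unchanged, guarding against baseline scans leaking temporary worktrees. The same before/after pattern can be sketched in isolation (hypothetical helper names, not part of this PR):

```python
import subprocess

def worktree_snapshot(repo: str) -> bytes:
    # Capture the repo's `git worktree list` output verbatim.
    return subprocess.run(
        ["git", "worktree", "list"],
        cwd=repo, capture_output=True, check=True,
    ).stdout

def assert_worktrees_unchanged(repo: str, action) -> None:
    # Run `action` and verify it left the worktree listing untouched,
    # mirroring the before/after comparison in the test above.
    before = worktree_snapshot(repo)
    action()
    after = worktree_snapshot(repo)
    assert before == after, "action changed git worktree state"
```

Comparing the raw `stdout` bytes, as the test does, is deliberately strict: any added, removed, or relocated worktree changes the listing and fails the assertion.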
diff --git cli/tests/default/unit/targeting/test_exclude.py cli/tests/default/unit/targeting/test_exclude.py
index fab0011af37f..b82c7c8465e4 100644
--- cli/tests/default/unit/targeting/test_exclude.py
+++ cli/tests/default/unit/targeting/test_exclude.py
@@ -190,7 +190,9 @@
     ],
 )
 def test_filter_exclude(patterns, expected_kept):
-    actual = TargetManager(".").filter_excludes(patterns, candidates=CANDIDATES)
+    actual = TargetManager(scanning_root_strings=".").filter_excludes(
+        patterns, candidates=CANDIDATES
+    )
     expected_kept = frozenset(Path(name) for name in expected_kept)
     assert actual.kept == expected_kept
     assert actual.kept == CANDIDATES - actual.removed
@@ -211,10 +213,10 @@ def test_filter_exclude(patterns, expected_kept):
 @pytest.mark.parametrize("pattern_variant", EQUIVALENT_PATTERNS)
 def test_filter_exclude__equivalent_variants(pattern_variant):
     """Test some different variations of the pattern yield the same result."""
-    expected_result = TargetManager(".").filter_excludes(
+    expected_result = TargetManager(scanning_root_strings=".").filter_excludes(
         [EQUIVALENT_PATTERNS[0]], candidates=CANDIDATES
     )
-    actual_result = TargetManager(".").filter_excludes(
+    actual_result = TargetManager(scanning_root_strings=".").filter_excludes(
         [pattern_variant], candidates=CANDIDATES
     )
     assert actual_result == expected_result
diff --git cli/tests/default/unit/targeting/test_include.py cli/tests/default/unit/targeting/test_include.py
index 1b9999998ab6..186de19abdea 100644
--- cli/tests/default/unit/targeting/test_include.py
+++ cli/tests/default/unit/targeting/test_include.py
@@ -128,7 +128,9 @@
     ],
 )
 def test_filter_include(patterns, expected_kept):
-    actual = TargetManager(".").filter_includes(patterns, candidates=CANDIDATES)
+    actual = TargetManager(scanning_root_strings=".").filter_includes(
+        patterns, candidates=CANDIDATES
+    )
     expected_kept = frozenset(Path(name) for name in expected_kept)
     assert actual.kept == expected_kept
     assert actual.removed == CANDIDATES - actual.kept
@@ -149,10 +151,10 @@ def test_filter_include(patterns, expected_kept):
 @pytest.mark.parametrize("pattern_variant", EQUIVALENT_PATTERNS)
 def test_filter_include__equivalent_variants(pattern_variant):
     """Test some different variations of the pattern yield the same result."""
-    expected_result = TargetManager(".").filter_includes(
+    expected_result = TargetManager(scanning_root_strings=".").filter_includes(
         [EQUIVALENT_PATTERNS[0]], candidates=CANDIDATES
     )
-    actual_result = TargetManager(".").filter_includes(
+    actual_result = TargetManager(scanning_root_strings=".").filter_includes(
         [pattern_variant], candidates=CANDIDATES
     )
     assert actual_result == expected_result
diff --git cli/tests/default/unit/targeting/test_target_manager.py cli/tests/default/unit/targeting/test_target_manager.py
index 8e46edad2ffa..5a055b635d56 100644
--- cli/tests/default/unit/targeting/test_target_manager.py
+++ cli/tests/default/unit/targeting/test_target_manager.py
@@ -35,10 +35,12 @@ def test_nonexistent(tmp_path, monkeypatch):
     monkeypatch.chdir(tmp_path)
 
     # shouldnt raise an error
-    TargetManager(["foo/a.py"])
+    TargetManager(scanning_root_strings=["foo/a.py"])
 
     with pytest.raises(InvalidScanningRootError) as e:
-        TargetManager(["foo/a.py", "foo/doesntexist.py"])
+        TargetManager(
+            scanning_root_strings=["foo/a.py", "foo/doesntexist.py"],
+        )
     assert e.value.paths == (Path("foo/doesntexist.py"),)
 
 
@@ -60,7 +62,7 @@ def test_delete_git(tmp_path, monkeypatch):
     foo.unlink()
     subprocess.run(["git", "status"])
 
-    assert_path_sets_equal(ScanningRoot(".", True).files(), {bar})
+    assert_path_sets_equal(ScanningRoot(".", True).target_files(), {bar})
 
 
 @pytest.mark.quick
@@ -179,12 +181,16 @@ def test_get_files_for_language(
 
     if expected is None:
         with pytest.raises(InvalidScanningRootError):
-            target_manager = paths.TargetManager(targets)
+            target_manager = paths.TargetManager(scanning_root_strings=targets)
         return
     else:
-        target_manager = paths.TargetManager(targets)
+        target_manager = paths.TargetManager(
+            scanning_root_strings=targets,
+        )
 
-    actual = target_manager.get_files_for_language(LANG_PY, SAST_PRODUCT).kept
+    actual = target_manager.get_files_for_language(
+        lang=LANG_PY, product=SAST_PRODUCT
+    ).kept
 
     assert_path_sets_equal(actual, getattr(paths, expected))
 
@@ -201,12 +207,16 @@ def test_skip_symlink(tmp_path, monkeypatch):
     PY = Language("python")
 
     assert_path_sets_equal(
-        TargetManager([str(foo)]).get_files_for_language(PY, SAST_PRODUCT).kept,
+        TargetManager(scanning_root_strings=[str(foo)])
+        .get_files_for_language(lang=PY, product=SAST_PRODUCT)
+        .kept,
         {foo / "a.py"},
     )
 
     with pytest.raises(InvalidScanningRootError):
-        TargetManager([str(foo / "link.py")]).get_files_for_language(PY, SAST_PRODUCT)
+        TargetManager(
+            scanning_root_strings=[str(foo / "link.py")],
+        ).get_files_for_language(lang=PY, product=SAST_PRODUCT)
 
 
 @pytest.mark.quick
@@ -220,7 +230,7 @@ def test_ignore_git_dir(tmp_path, monkeypatch):
 
     monkeypatch.chdir(tmp_path)
     language = Language("generic")
-    assert frozenset() == TargetManager([foo]).get_files_for_rule(
+    assert frozenset() == TargetManager(scanning_root_strings=[foo]).get_files_for_rule(
         language, [], [], "dummy_rule_id", SAST_PRODUCT
     )
 
@@ -245,55 +255,63 @@ def test_explicit_path(tmp_path, monkeypatch):
     python_language = Language("python")
 
     assert foo_a in TargetManager(
-        ["foo/a.py"], allow_unknown_extensions=True
+        scanning_root_strings=["foo/a.py"],
+        allow_unknown_extensions=True,
+    ).get_files_for_rule(python_language, [], [], "dummy_rule_id", SAST_PRODUCT)
+    assert foo_a in TargetManager(
+        scanning_root_strings=["foo/a.py"]
     ).get_files_for_rule(python_language, [], [], "dummy_rule_id", SAST_PRODUCT)
-    assert foo_a in TargetManager(["foo/a.py"]).get_files_for_rule(
-        python_language, [], [], "dummy_rule_id", SAST_PRODUCT
-    )
 
     # Should include explicitly passed python file even if is in excludes
     assert foo_a not in TargetManager(
-        ["."], [], {SAST_PRODUCT: ["foo/a.py"]}
+        scanning_root_strings=["."],
+        includes=[],
+        excludes={SAST_PRODUCT: ["foo/a.py"]},
     ).get_files_for_rule(python_language, [], [], "dummy_rule_id", SAST_PRODUCT)
     assert foo_a in TargetManager(
-        [".", "foo/a.py"], [], {SAST_PRODUCT: ["foo/a.py"]}
+        scanning_root_strings=[".", "foo/a.py"],
+        includes=[],
+        excludes={SAST_PRODUCT: ["foo/a.py"]},
     ).get_files_for_rule(python_language, [], [], "dummy_rule_id", SAST_PRODUCT)
 
     # Should ignore expliclty passed .go file when requesting python
     assert (
-        TargetManager(["foo/a.go"]).get_files_for_rule(
-            python_language, [], [], "dummy_rule_id", SAST_PRODUCT
-        )
+        TargetManager(
+            scanning_root_strings=["foo/a.go"],
+        ).get_files_for_rule(python_language, [], [], "dummy_rule_id", SAST_PRODUCT)
         == frozenset()
     )
 
     # Should include explicitly passed file with unknown extension if allow_unknown_extensions=True
     assert_path_sets_equal(
-        TargetManager(["foo/noext"], allow_unknown_extensions=True).get_files_for_rule(
-            python_language, [], [], "dummy_rule_id", SAST_PRODUCT
-        ),
+        TargetManager(
+            scanning_root_strings=["foo/noext"],
+            allow_unknown_extensions=True,
+        ).get_files_for_rule(python_language, [], [], "dummy_rule_id", SAST_PRODUCT),
         {foo_noext},
     )
 
     # Should not include explicitly passed file with unknown extension by default
     assert_path_sets_equal(
-        TargetManager(["foo/noext"]).get_files_for_rule(
-            python_language, [], [], "dummy_rule_id", SAST_PRODUCT
-        ),
+        TargetManager(
+            scanning_root_strings=["foo/noext"],
+        ).get_files_for_rule(python_language, [], [], "dummy_rule_id", SAST_PRODUCT),
         set(),
     )
 
     # Should include explicitly passed file with correct extension even if skip_unknown_extensions=True
     assert_path_sets_equal(
-        TargetManager(["foo/noext", "foo/a.py"]).get_files_for_rule(
-            python_language, [], [], "dummy_rule_id", SAST_PRODUCT
-        ),
+        TargetManager(
+            scanning_root_strings=["foo/noext", "foo/a.py"],
+        ).get_files_for_rule(python_language, [], [], "dummy_rule_id", SAST_PRODUCT),
         {foo_a},
     )
 
     # Should respect includes/excludes passed to get_files even if target explicitly passed
     assert_path_sets_equal(
-        TargetManager(["foo/a.py", "foo/b.py"]).get_files_for_rule(
+        TargetManager(
+            scanning_root_strings=["foo/a.py", "foo/b.py"],
+        ).get_files_for_rule(
             python_language, ["a.py"], [], "dummy_rule_id", SAST_PRODUCT
         ),
         {foo_a},
@@ -302,7 +320,8 @@ def test_explicit_path(tmp_path, monkeypatch):
     # Should respect excludes on a per-product basis
     assert_path_sets_equal(
         TargetManager(
-            ["foo/a.py", "foo/b.py"], excludes={SAST_PRODUCT: ["*.py"]}
+            scanning_root_strings=["foo/a.py", "foo/b.py"],
+            excludes={SAST_PRODUCT: ["*.py"]},
         ).get_files_for_rule(
             python_language, ["a.py"], [], "dummy_rule_id", SECRETS_PRODUCT
         ),
@@ -314,7 +333,7 @@ def test_explicit_path(tmp_path, monkeypatch):
 def test_ignores(tmp_path, monkeypatch):
     def ignore(ignore_pats, profile_product=SAST_PRODUCT, rule_product=SAST_PRODUCT):
         return TargetManager(
-            [tmp_path],
+            scanning_root_strings=[tmp_path],
             ignore_profiles={
                 profile_product: FileIgnore.from_unprocessed_patterns(
                     tmp_path, ignore_pats, max_log_list_entries=0
@@ -445,11 +464,11 @@ def test_unsupported_lang_paths(tmp_path, monkeypatch):
             if os.path.splitext(path)[1] != ".py":
                 expected_unsupported.add(path)
 
-    target_manager = TargetManager(targets)
+    target_manager = TargetManager(scanning_root_strings=targets)
 
-    target_manager.get_files_for_language(LANG_PY, SAST_PRODUCT)
-    target_manager.get_files_for_language(LANG_GENERIC, SAST_PRODUCT)
-    target_manager.get_files_for_language(LANG_REGEX, SAST_PRODUCT)
+    target_manager.get_files_for_language(lang=LANG_PY, product=SAST_PRODUCT)
+    target_manager.get_files_for_language(lang=LANG_GENERIC, product=SAST_PRODUCT)
+    target_manager.get_files_for_language(lang=LANG_REGEX, product=SAST_PRODUCT)
 
     assert_path_sets_equal(
         target_manager.ignore_log.unsupported_lang_paths, expected_unsupported
@@ -481,10 +500,10 @@ def test_unsupported_lang_paths_2(tmp_path, monkeypatch):
             targets.append(str(path))
             expected_unsupported.add(path)
 
-    target_manager = TargetManager(targets)
+    target_manager = TargetManager(scanning_root_strings=targets)
 
-    target_manager.get_files_for_language(LANG_GENERIC, SAST_PRODUCT)
-    target_manager.get_files_for_language(LANG_REGEX, SAST_PRODUCT)
+    target_manager.get_files_for_language(lang=LANG_GENERIC, product=SAST_PRODUCT)
+    target_manager.get_files_for_language(lang=LANG_REGEX, product=SAST_PRODUCT)
 
     assert_path_sets_equal(
         target_manager.ignore_log.unsupported_lang_paths, expected_unsupported
@@ -548,7 +567,7 @@ def test_ignore_baseline_handler(monkeypatch, tmp_path):
     # Call get_files_for_language with ignore_baseline_handler=False
     # Should only return lockfiles in dir_b and dir_c as they were changed after base_commit
     diff_files = target_manager.get_files_for_language(
-        Ecosystem(Pypi()), SCA_PRODUCT, False
+        lang=Ecosystem(Pypi()), product=SCA_PRODUCT, ignore_baseline_handler=False
     ).kept
     assert {str(dir_b_poetry), str(dir_c_poetry)} == {
         str(path) for path in diff_files
@@ -557,7 +576,7 @@ def test_ignore_baseline_handler(monkeypatch, tmp_path):
     # Call get_files_for_language with ignore_baseline_handler=True
     # Should return all three lockfiles
     all_files = target_manager.get_files_for_language(
-        Ecosystem(Pypi()), SCA_PRODUCT, True
+        lang=Ecosystem(Pypi()), product=SCA_PRODUCT, ignore_baseline_handler=True
     ).kept
     assert {str(dir_a_poetry), str(dir_b_poetry), str(dir_c_poetry)} == {
         str(path) for path in all_files
diff --git cli/tests/default/unit/test_filter_changed_subprojects.py cli/tests/default/unit/test_filter_changed_subprojects.py
index 28440cc05cef..ecd371b5497b 100644
--- cli/tests/default/unit/test_filter_changed_subprojects.py
+++ cli/tests/default/unit/test_filter_changed_subprojects.py
@@ -73,7 +73,7 @@ def test_without_baseline(monkeypatch: pytest.MonkeyPatch, tmp_path: Path):
     subprocess.check_call(["git", "commit", "-m", "first"])
 
     # Set up TargetManager
-    target_manager = TargetManager(frozenset([Path(".")]))
+    target_manager = TargetManager(scanning_root_strings=frozenset([Path(".")]))
 
     subprojects = [make_subproject(foo_a, foo_b, out.Ecosystem(out.Pypi()))]
 
@@ -193,7 +193,8 @@ def test_with_baseline__new_code_files(
     # Set up TargetManager
     baseline_handler = BaselineHandler(base_commit, True)
     target_manager = TargetManager(
-        frozenset([Path(".")]), baseline_handler=baseline_handler
+        scanning_root_strings=frozenset([Path(".")]),
+        baseline_handler=baseline_handler,
     )
 
     relevant, irrelevant = filter_changed_subprojects(
@@ -331,7 +332,8 @@ def test_with_baseline__changed_source_files(
     # Set up TargetManager
     baseline_handler = BaselineHandler(base_commit, True)
     target_manager = TargetManager(
-        frozenset([Path(".")]), baseline_handler=baseline_handler
+        scanning_root_strings=frozenset([Path(".")]),
+        baseline_handler=baseline_handler,
     )
 
     relevant, irrelevant = filter_changed_subprojects(
diff --git cli/tests/default/unit/test_resolve_subprojects.py cli/tests/default/unit/test_resolve_subprojects.py
index 3f57ddae90cb..f7fe6a781ef9 100644
--- cli/tests/default/unit/test_resolve_subprojects.py
+++ cli/tests/default/unit/test_resolve_subprojects.py
@@ -13,6 +13,7 @@
 from semgrep.subproject import LockfileOnlyDependencySource
 from semgrep.subproject import ManifestLockfileDependencySource
 from semgrep.subproject import ManifestOnlyDependencySource
+from semgrep.subproject import ResolutionMethod
 from semgrep.subproject import Subproject
 
 
@@ -186,7 +187,7 @@ def test_ptt_unconditionally_generates_dependency_graphs(
     lockfile_file.write("requests==2.25.1")
     lockfile_file.close()
 
-    mock_dynamic_resolve.return_value = ["mock_ecosystem", [], [], []]
+    mock_dynamic_resolve.return_value = [[], [], []]
     dep_source = ManifestLockfileDependencySource(
         manifest=out.Manifest(
             out.ManifestKind(value=out.RequirementsIn()),
@@ -197,7 +198,10 @@ def test_ptt_unconditionally_generates_dependency_graphs(
             out.Fpath(str(tmp_path / "requirements.txt")),
         ),
     )
-    resolve_dependency_source(dep_source, True, True)
+
+    deps, _, _ = resolve_dependency_source(dep_source, True, True)
+    assert deps is not None
+    assert deps[0] == ResolutionMethod.DYNAMIC
 
     mock_dynamic_resolve.mock_assert_called_once_with(
         Path("requirements.txt"), out.ManifestKind(value=out.RequirementsIn())
@@ -205,8 +209,8 @@ def test_ptt_unconditionally_generates_dependency_graphs(
 
 
 @pytest.mark.quick
-@patch("semgrep.resolve_dependency_source._resolve_dependencies_rpc")
 @patch("semdep.parsers.requirements.parse_requirements")
+@patch("semgrep.resolve_dependency_source._resolve_dependencies_rpc")
 def test_ptt_unconditional_graph_generation_falls_back_on_lockfile_parsing(
     mock_dynamic_resolve, mock_parse_requirements, tmp_path: Path
 ) -> None:
@@ -217,7 +221,7 @@ def test_ptt_unconditional_graph_generation_falls_back_on_lockfile_parsing(
     lockfile_file.write("requests==2.25.1")
     lockfile_file.close()
 
-    mock_dynamic_resolve.return_value = [None, [], [], []]
+    mock_dynamic_resolve.return_value = [None, [], []]
     mock_parse_requirements.return_value = (
         [
             out.FoundDependency(
@@ -230,6 +234,7 @@ def test_ptt_unconditional_graph_generation_falls_back_on_lockfile_parsing(
         ],
         [],
     )
+
     dep_source = ManifestLockfileDependencySource(
         manifest=out.Manifest(
             out.ManifestKind(value=out.RequirementsIn()),
@@ -240,7 +245,11 @@ def test_ptt_unconditional_graph_generation_falls_back_on_lockfile_parsing(
             out.Fpath(str(tmp_path / "requirements.txt")),
         ),
     )
-    resolve_dependency_source(dep_source, True, True)
+    deps, _, _ = resolve_dependency_source(dep_source, True, True)
+    assert deps is not None
+    assert deps[0] == ResolutionMethod.LOCKFILE_PARSING
+    assert len(deps[1]) == 1
+    assert deps[1][0].package == "requests"
 
     mock_parse_requirements.mock_assert_called_once_with(
         Path(tmp_path / "requirements.txt"), Path(tmp_path / "requirements.in")
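The updated test above asserts both which resolution method ran and what it returned. As a rough, self-contained sketch of the fallback behavior being tested (hypothetical names; semgrep's real `resolve_dependency_source` takes a dependency source object, not callables):

```python
from enum import Enum, auto
from typing import Callable, List, Optional

class ResolutionMethod(Enum):
    DYNAMIC = auto()
    LOCKFILE_PARSING = auto()

def resolve_dependencies(
    dynamic_resolve: Callable[[], Optional[List[str]]],
    parse_lockfile: Callable[[], List[str]],
):
    # Try dynamic resolution first; on failure (None), fall back to
    # parsing the lockfile, reporting which method actually succeeded.
    deps = dynamic_resolve()
    if deps is not None:
        return ResolutionMethod.DYNAMIC, deps
    return ResolutionMethod.LOCKFILE_PARSING, parse_lockfile()
```

This mirrors why the test patches `_resolve_dependencies_rpc` to return `None` in the leading slot and then checks `deps[0] == ResolutionMethod.LOCKFILE_PARSING`.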
diff --git dune-project dune-project
index f7fe59dd3e0f..dd968e412f1e 100644
--- dune-project
+++ dune-project
@@ -21,7 +21,7 @@
 (generate_opam_files)
 
 ;; set here so the semgrep package below can use it and we can easily bump it
-(version 1.107.0)
+(version 1.108.0)
 
 ;; Default attributes of opam packages
 (source (github semgrep/semgrep))
@@ -521,7 +521,7 @@ For more information see https://semgrep.dev
     (ocurl (= 0.9.1))
     opentelemetry-client-ocurl
     ambient-context-lwt
-    (conf-libcurl (= 1)) ; force older version of conf-libcurl to make windows work
+    conf-libcurl
     ; web stuff
     uri
     (uuidm (>= 0.9.9))
diff --git flake.lock flake.lock
index 204273fcaaff..32b16e4ab41a 100644
--- flake.lock
+++ flake.lock
@@ -70,11 +70,11 @@
     },
     "nixpkgs": {
       "locked": {
-        "lastModified": 1738410390,
-        "narHash": "sha256-xvTo0Aw0+veek7hvEVLzErmJyQkEcRk6PSR4zsRQFEc=",
+        "lastModified": 1739020877,
+        "narHash": "sha256-mIvECo/NNdJJ/bXjNqIh8yeoSjVLAuDuTUzAo7dzs8Y=",
         "owner": "nixos",
         "repo": "nixpkgs",
-        "rev": "3a228057f5b619feb3186e986dbe76278d707b6e",
+        "rev": "a79cfe0ebd24952b580b1cf08cd906354996d547",
         "type": "github"
       },
       "original": {
@@ -157,11 +157,11 @@
     "opam-repository_2": {
       "flake": false,
       "locked": {
-        "lastModified": 1738493790,
-        "narHash": "sha256-b74GR1FnSrvHUOJ3DmUuaPqYEtMsZIfHqr8gEELidsc=",
+        "lastModified": 1739144785,
+        "narHash": "sha256-Z11aL18rw+DZ7bDrwRdjDC0vW7hH+7+QnDET5LLmkjA=",
         "owner": "ocaml",
         "repo": "opam-repository",
-        "rev": "f872de4b7a2f4b7ccc24dfe491575cabe8c9fabe",
+        "rev": "36f4d2d5ceb882a79fd7d7f956b490201da67226",
         "type": "github"
       },
       "original": {
diff --git libs/commons/CapFS.ml libs/commons/CapFS.ml
index e69de29bb2d1..ea3ba9dcba5f 100644
--- libs/commons/CapFS.ml
+++ libs/commons/CapFS.ml
@@ -0,0 +1,29 @@
+open Fpath_.Operators
+
+let readdir _caps = Unix.readdir
+
+(* helpers *)
+
+let with_dir_handle path func =
+  let dir = UUnix.opendir !!path in
+  Common.protect ~finally:(fun () -> UUnix.closedir dir) (fun () -> func dir)
+
+(* Read the names found in a directory, excluding "." and "..". *)
+let read_dir_entries (caps : < Cap.readdir ; .. >) path =
+  with_dir_handle path (fun dir ->
+      let rec loop acc =
+        try
+          (* alt: use Sys.readdir which already filters "." and ".." *)
+          let name = readdir caps#readdir dir in
+          let acc =
+            if
+              name = Filename.current_dir_name (* "." *)
+              || name = Filename.parent_dir_name (* ".." *)
+            then acc
+            else name :: acc
+          in
+          loop acc
+        with
+        | End_of_file -> List.rev acc
+      in
+      loop [])
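For readers less familiar with the OCaml readdir loop above, an equivalent stdlib Python sketch of `read_dir_entries` (note `os.scandir` never yields `"."` or `".."`, so the explicit filter from the OCaml version disappears):

```python
import os

def read_dir_entries(path):
    # Return the names found in `path`, excluding "." and "..".
    # os.scandir omits those pseudo-entries, unlike a raw readdir loop,
    # and the handle is closed when the generator is exhausted.
    return sorted(entry.name for entry in os.scandir(path))
```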
diff --git libs/commons/CapFS.mli libs/commons/CapFS.mli
index e69de29bb2d1..9f90e52715f0 100644
--- libs/commons/CapFS.mli
+++ libs/commons/CapFS.mli
@@ -0,0 +1,4 @@
+val readdir : Cap.FS.readdir -> Unix.dir_handle -> string
+
+(* Read the names found in a directory, excluding "." and "..". *)
+val read_dir_entries : < Cap.readdir ; .. > -> Fpath.t -> string list
diff --git libs/commons/UFile.ml libs/commons/UFile.ml
index d6cebad59112..0a5d156a1dc0 100644
--- libs/commons/UFile.ml
+++ libs/commons/UFile.ml
@@ -134,27 +134,32 @@ module Legacy = struct
   (** [dir_contents] returns the paths of all regular files that are
  * contained in [dir]. Each file is a path starting with [dir].
   *)
-  let dir_contents dir =
+  let dir_contents ?(strict = false) dir =
     let rec loop result = function
       | f :: fs -> (
           match f with
-          | f when not (USys.file_exists f) ->
-              Log.warn (fun m -> m "%s does not exist anymore" f);
-              loop result fs
+          | f when not (USys.file_exists f) -> loop result fs
           | f when USys.is_directory f ->
-              USys.readdir f |> Array.to_list
+              let caps = Cap.readdir_UNSAFE () in
+              let entries = CapFS.read_dir_entries caps (Fpath.v f) in
+              entries
               |> List_.map (Filename.concat f)
               |> List.append fs |> loop result
           | f -> loop (f :: result) fs)
       | [] -> result
     in
+    (* only check the existence of the root, and only in strict mode *)
+    if strict then
+      if not (USys.file_exists dir) then
+        invalid_arg
+          (spf "files_of_dirs_or_files_no_vcs_nofilter: %s does not exist" dir);
     loop [] [ dir ]
 
-  let files_of_dirs_or_files_no_vcs_nofilter xs =
+  let files_of_dirs_or_files_no_vcs_nofilter ?strict xs =
     xs
     |> List_.map (fun x ->
            if USys.is_directory x then
-             let files = dir_contents x in
+             let files = dir_contents ?strict x in
              List.filter (fun x -> not (Re.execp vcs_re x)) files
            else [ x ])
     |> List_.flatten
@@ -192,15 +197,15 @@ let file_kind_of_yojson (yojson : Yojson.Safe.t) =
            "Could not convert to Unix.file_kind expected `String, received %s"
            Yojson.Safe.(to_string json))
 
-let files_of_dirs_or_files_no_vcs_nofilter xs =
-  xs |> Fpath_.to_strings |> Legacy.files_of_dirs_or_files_no_vcs_nofilter
+let files_of_dirs_or_files_no_vcs_nofilter ?strict xs =
+  xs |> Fpath_.to_strings
+  |> Legacy.files_of_dirs_or_files_no_vcs_nofilter ?strict
   |> Fpath_.of_strings
 
 let cat path = Legacy.cat !!path
 let cat_array file = "" :: cat file |> Array.of_list
 let write_file ~file data = Legacy.write_file ~file:!!file data
 let read_file ?max_len path = Legacy.read_file ?max_len !!path
-let with_open_out path func = Legacy.with_open_outfile !!path func
 let with_open_in path func = Legacy.with_open_infile !!path func
 
 let filesize file =
@@ -289,6 +294,10 @@ let rec make_directories dir =
       make_directories parent;
       make_directories dir
 
+let with_open_out ?(make_ancestors = false) path func =
+  if make_ancestors then make_directories (Fpath.parent path);
+  Legacy.with_open_outfile !!path func
+
 let find_first_match_with_whole_line path ?split:(chr = '\n') =
   Bos.OS.File.with_ic path @@ fun ic term ->
   let len = in_channel_length ic in
diff --git libs/commons/UFile.mli libs/commons/UFile.mli
index a7d667852ef8..ed0d88d64a59 100644
--- libs/commons/UFile.mli
+++ libs/commons/UFile.mli
@@ -17,9 +17,14 @@ val follow_symlinks : bool ref
 
 (* use the command 'find' internally and tries to skip files in
  * version control system (vcs) (e.g., .git, _darcs, etc.).
+ *
+ * strict: fail hard (Invalid_argument exception) if the paths given
+ * as arguments don't exist.
+ *
  * Deprecated?
  *)
-val files_of_dirs_or_files_no_vcs_nofilter : Fpath.t list -> Fpath.t list
+val files_of_dirs_or_files_no_vcs_nofilter :
+  ?strict:bool -> Fpath.t list -> Fpath.t list
 
 (*****************************************************************************)
 (* IO *)
@@ -67,7 +72,17 @@ val read_file : ?max_len:int -> Fpath.t -> string
  *     pr "this goes in foo.txt"
  *   )
  *)
-val with_open_out : Fpath.t -> ((string -> unit) * out_channel -> 'a) -> 'a
+val with_open_out :
+  ?make_ancestors:bool ->
+  Fpath.t ->
+  ((string -> unit) * out_channel -> 'a) ->
+  'a
+(** [with_open_out ~make_ancestors path f] opens, creating if necessary, [path]
+    and applies [f] to the resulting [out_channel].
+
+    If [make_ancestors] is specified and true, it creates any necessary
+    ancestor directories too. *)
+
 val with_open_in : Fpath.t -> (in_channel -> 'a) -> 'a
 
 val find_first_match_with_whole_line :
@@ -152,7 +167,7 @@ val make_directories : Fpath.t -> unit
 (* Deprecated! *)
 module Legacy : sig
   val files_of_dirs_or_files_no_vcs_nofilter :
-    string (* root *) list -> string (* filename *) list
+    ?strict:bool -> string (* root *) list -> string (* filename *) list
 
   val cat : string (* filename *) -> string list
   val write_file : file:string (* filename *) -> string -> unit
@@ -164,6 +179,7 @@ module Legacy : sig
   val with_open_infile : string (* filename *) -> (in_channel -> 'a) -> 'a
 
   (* NOT IN MAIN API *)
-  val dir_contents : string (* filename *) -> string (* filename *) list
+  val dir_contents :
+    ?strict:bool -> string (* filename *) -> string (* filename *) list
   (** [dir_contents dir] will return a recursive list of all files in a dir *)
 end
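The `make_ancestors` behavior documented in the interface above can be mimicked in a few lines of Python (a sketch under assumed semantics, not a semgrep API):

```python
import os
from contextlib import contextmanager

@contextmanager
def with_open_out(path, make_ancestors=False):
    # Optionally create any missing parent directories, then open the
    # file for writing; the channel is closed on exit either way.
    if make_ancestors:
        parent = os.path.dirname(path)
        if parent:
            os.makedirs(parent, exist_ok=True)
    f = open(path, "w")
    try:
        yield f
    finally:
        f.close()
```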
diff --git libs/commons/Uri_.ml libs/commons/Uri_.ml
index da85829d9a76..a1c750f7ea8e 100644
--- libs/commons/Uri_.ml
+++ libs/commons/Uri_.ml
@@ -28,3 +28,6 @@ let show (uri : Uri.t) : string = Fmt_.to_show Uri.pp uri
 let of_string_opt (str : string) : Uri.t option =
   let uri = Uri.of_string str in
   if Uri.equal uri Uri.empty then None else Some uri
+
+let of_fpath (file : Fpath.t) : Uri.t =
+  Uri.make ~scheme:"file" ~host:"" ~path:(Fpath.to_string file) ()
diff --git libs/commons/Uri_.mli libs/commons/Uri_.mli
index 77c18e1bf8ba..4e8abd1a98c5 100644
--- libs/commons/Uri_.mli
+++ libs/commons/Uri_.mli
@@ -5,3 +5,4 @@ val of_string_opt : string -> Uri.t option
 
 (* rely on Uri.pp *)
 val show : Uri.t -> string
+val of_fpath : Fpath.t -> Uri.t
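The new `of_fpath` builds a `file:` URI with an explicit empty host; for comparison, Python's `pathlib` exposes the same conversion for absolute paths:

```python
from pathlib import Path

# Path.as_uri() requires an absolute path and yields a file:// URI,
# matching the scheme:"file", host:"" construction in Uri_.of_fpath.
uri = Path("/tmp/example.txt").as_uri()
```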
diff --git libs/commons2/common2.ml libs/commons2/common2.ml
index 6fa0dd50d107..df95aef9752a 100644
--- libs/commons2/common2.ml
+++ libs/commons2/common2.ml
@@ -2510,8 +2510,8 @@ let capsule_unix f args =
         (Printf.sprintf "exn Unix_error: %s %s %s\n" (Unix.error_message e) fm
            argm)
 
-let (readdir_to_kind_list : string -> Unix.file_kind -> string list) =
- fun path kind ->
+(*
+let readdir_to_kind_list (path : string) (kind : Unix.file_kind) : string list =
   USys.readdir path |> Array.to_list
   |> List.filter (fun s ->
          try
@@ -2537,6 +2537,7 @@ let (readdir_to_dir_size_list : string -> (string * int) list) =
   |> List_.filter_map (fun s ->
          let stat = UUnix.lstat (path ^ "/" ^ s) in
          if stat.st_kind =*= Unix.S_DIR then Some (s, stat.st_size) else None)
+*)
 
 let unixname () =
   let uid = UUnix.getuid () in
diff --git libs/commons2/common2.mli libs/commons2/common2.mli
index ddd7582b8b24..7e2102d2a630 100644
--- libs/commons2/common2.mli
+++ libs/commons2/common2.mli
@@ -761,11 +761,15 @@ val is_directory_eff : path -> bool
 val is_file_eff : path -> bool
 val is_executable_eff : filename -> bool
 val capsule_unix : ('a -> unit) -> 'a -> unit
-val readdir_to_kind_list : string -> Unix.file_kind -> string list
-val readdir_to_dir_list : string -> dirname list
-val readdir_to_file_list : string -> filename list
-val readdir_to_link_list : string -> string list
-val readdir_to_dir_size_list : string -> (string * int) list
+
+(* deprecated, should use CapFS or libs/paths/List_files.mli
+   val readdir_to_kind_list : string -> Unix.file_kind -> string list
+   val readdir_to_dir_list : string -> dirname list
+   val readdir_to_file_list : string -> filename list
+   val readdir_to_link_list : string -> string list
+   val readdir_to_dir_size_list : string -> (string * int) list
+*)
+
 val unixname : unit -> string
 
 val glob : string -> filename list
diff --git libs/ograph/ograph_call_dot_gv.ml libs/ograph/ograph_call_dot_gv.ml
index d6b06d2ec23d..ce7c46724e15 100644
--- libs/ograph/ograph_call_dot_gv.ml
+++ libs/ograph/ograph_call_dot_gv.ml
@@ -3,8 +3,7 @@ open Common
 (*****************************************************************************)
 (* Prelude *)
 (*****************************************************************************)
-(* Call 'dot', 'gv', or 'open' to display a graph
- *)
+(* Call 'dot', 'gv', or 'open' to display a graph *)
 
 (*****************************************************************************)
 (* Dot generation *)
@@ -65,13 +64,13 @@ let generate_ograph_xxx g filename =
 (* TODO: switch from cmd_to_list to UCmd.status_of_run with
  * properly built Cmd, or even switch to CapExec!
  *)
-let launch_png_cmd (caps : < Cap.exec >) filename =
+let launch_png_cmd (caps : < Cap.exec ; .. >) filename =
   CapExec.cmd_to_list caps#exec (spf "dot -Tpng %s -o %s.png" filename filename)
   |> ignore;
   CapExec.cmd_to_list caps#exec (spf "open %s.png" filename) |> ignore;
   ()
 
-let launch_gv_cmd (caps : < Cap.exec >) filename =
+let launch_gv_cmd (caps : < Cap.exec ; .. >) filename =
   CapExec.cmd_to_list caps#exec
     ("dot " ^ filename ^ " -Tps  -o " ^ filename ^ ".ps;")
   |> ignore;
@@ -82,7 +81,7 @@ let launch_gv_cmd (caps : < Cap.exec >) filename =
    *)
   ()
 
-let display_graph_cmd (caps : < Cap.exec >) filename =
+let display_graph_cmd (caps : < Cap.exec ; .. >) filename =
   match Platform.kernel caps with
   | Platform.Darwin -> launch_png_cmd caps filename
   | Platform.Linux -> launch_gv_cmd caps filename
@@ -96,7 +95,8 @@ let print_ograph_mutable caps g filename display_graph =
   generate_ograph_xxx g filename;
   if display_graph then display_graph_cmd caps filename
 
-let print_ograph_mutable_generic caps ?title ?(display_graph = true)
+let print_ograph_mutable_generic (caps : < Cap.exec >) ?title
+    ?(display_graph = true)
     ?(output_file = Fpath.(UTmp.get_temp_dir_name () / "ograph.dot")) ~s_of_node
     g =
   UFile.with_open_out output_file (fun (_, oc) ->
@@ -104,7 +104,8 @@ let print_ograph_mutable_generic caps ?title ?(display_graph = true)
       generate_ograph_generic g title s_of_node f);
   if display_graph then display_graph_cmd caps (Fpath.to_string output_file)
 
-let pp_ograph_mutable_generic caps ?title ~s_of_node f g : unit =
+let pp_ograph_mutable_generic (caps : < Cap.exec ; .. >) ?title ~s_of_node f g :
+    unit =
   CapTmp.with_temp_file caps#tmp (fun tmp ->
       (* Write dot code *)
       UFile.with_open_out tmp (fun (_, oc) ->
diff --git libs/ograph/ograph_call_dot_gv.mli libs/ograph/ograph_call_dot_gv.mli
index 22b6e0417de9..742c53cd4f1d 100644
--- libs/ograph/ograph_call_dot_gv.mli
+++ libs/ograph/ograph_call_dot_gv.mli
@@ -5,6 +5,7 @@
  *)
 open Ograph_extended
 
+(* TODO: not sure why but I can't make it < Cap.exec; ..> below *)
 val print_ograph_mutable_generic :
   < Cap.exec > ->
   ?title:string ->
@@ -17,7 +18,7 @@ val print_ograph_mutable_generic :
   unit
 
 val pp_ograph_mutable_generic :
-  < Cap.exec ; Cap.tmp > ->
+  < Cap.exec ; Cap.tmp ; .. > ->
   ?title:string ->
   s_of_node:(nodei * 'node -> string * string option * string option) ->
   Format.formatter ->
@@ -38,4 +39,4 @@ val print_ograph_mutable :
   bool (* launch gv / show png ? *) ->
   unit
 
-val launch_gv_cmd : < Cap.exec > -> string (* filename *) -> unit
+val launch_gv_cmd : < Cap.exec ; .. > -> string (* filename *) -> unit
diff --git libs/ojsonnet/Unit_jsonnet.ml libs/ojsonnet/Unit_jsonnet.ml
index a466c96af4c9..28c6101dc56f 100644
--- libs/ojsonnet/Unit_jsonnet.ml
+++ libs/ojsonnet/Unit_jsonnet.ml
@@ -64,6 +64,7 @@ let mk_tests (caps : < Cap.time_limit >) (subdir : string)
                          to take over 1s on my machine without me touching
                          ojsonnet's code. *)
                       let timeout = 2.0 in
+                      let t1 = Unix.gettimeofday () in
                       let json_opt =
                         Common.save_excursion Conf.eval_strategy strategy
                           (fun () ->
@@ -76,9 +77,14 @@ let mk_tests (caps : < Cap.time_limit >) (subdir : string)
                       in
                       match json_opt with
                       | None ->
+                          let t2 = Unix.gettimeofday () in
+                          let dt = t2 -. t1 in
                           failwith
-                            (spf "%gs timeout on %s with %s" timeout !!file
-                               str_strategy)
+                            (spf
+                               "%.3fs (%gs) timeout on %s with %s - sometimes \
+                                happens when running the tests with excessive \
+                                parallelism"
+                               dt timeout !!file str_strategy)
                       | Some json ->
                           if not (Y.equal json expected) then
                            failwith

diff --git libs/paths/List_files.ml libs/paths/List_files.ml
index 0cd2cbb8bd48..455efd7f251a 100644
--- libs/paths/List_files.ml
+++ libs/paths/List_files.ml
@@ -28,43 +28,18 @@ module Log = Log_paths.Log
 (* Helpers *)
 (*************************************************************************)
 
-let with_dir_handle path func =
-  let dir = Unix.opendir !!path in
-  Common.protect ~finally:(fun () -> Unix.closedir dir) (fun () -> func dir)
+let rec iter_dir_entries caps func dir names =
+  List.iter (iter_dir_entry caps func dir) names
 
-(* Read the names found in a directory, excluding "." and "..". *)
-let read_dir_entries path =
-  with_dir_handle path (fun dir ->
-      let rec loop acc =
-        try
-          let name = Unix.readdir dir in
-          let acc =
-            if
-              name = Filename.current_dir_name (* "." *)
-              || name = Filename.parent_dir_name (* ".." *)
-            then acc
-            else name :: acc
-          in
-          loop acc
-        with
-        | End_of_file -> List.rev acc
-      in
-      loop [])
-
-let read_dir_entries_fpath path = read_dir_entries path |> List_.map Fpath.v
-
-let rec iter_dir_entries func dir names =
-  List.iter (iter_dir_entry func dir) names
-
-and iter_dir_entry func dir name =
-  let path = Fpath.add_seg dir name in
-  iter func path
+and iter_dir_entry caps func dir name =
+  let path = dir / name in
+  iter caps func path
 
 (*************************************************************************)
 (* Entry points *)
 (*************************************************************************)
 
-and iter func path =
+and iter caps func path =
   let stat =
     try Some (Unix.lstat !!path) with
     | Unix.Unix_error (_error_kind, _func, _info) ->
@@ -72,27 +47,27 @@ and iter func path =
         None
   in
   match stat with
-  | Some { Unix.st_kind = S_DIR; _ } -> iter_dir func path
+  | Some { Unix.st_kind = S_DIR; _ } -> iter_dir caps func path
   | Some stat (* regular file, symlink, etc. *) -> func path stat
   | None -> ()
 
-and iter_dir func dir =
-  let names = read_dir_entries dir in
-  iter_dir_entries func dir names
+and iter_dir caps func dir =
+  let names = CapFS.read_dir_entries caps dir in
+  iter_dir_entries caps func dir names
 
-let fold_left func init path =
+let fold_left caps func init path =
   let acc = ref init in
-  iter (fun path stat -> acc := func !acc path stat) path;
+  iter caps (fun path stat -> acc := func !acc path stat) path;
   !acc
 
-let list_with_stat path =
-  fold_left (fun acc path stat -> (path, stat) :: acc) [] path |> List.rev
+let list_with_stat caps path =
+  fold_left caps (fun acc path stat -> (path, stat) :: acc) [] path |> List.rev
 
-let list path = list_with_stat path |> List_.map fst
+let list caps path = list_with_stat caps path |> List_.map fst
 
 (* python: Target.files_from_filesystem *)
-let list_regular_files ?(keep_root = false) root_path =
-  list_with_stat root_path
+let list_regular_files ?(keep_root = false) caps root_path =
+  list_with_stat caps root_path
   |> List_.filter_map (fun (path, (stat : Unix.stats)) ->
          Log.debug (fun m -> m "root: %s path: %s" !!root_path !!path);
          if keep_root && path = root_path then Some path
diff --git libs/paths/List_files.mli libs/paths/List_files.mli
index 3cbb67d0ca2c..b75e516ea699 100644
--- libs/paths/List_files.mli
+++ libs/paths/List_files.mli
@@ -9,8 +9,9 @@
 (*
    List all files recursively. Exclude folders/directories.
    For further filtering based on file type, use 'list_with_stat'.
+   ex: [list caps "a/" --> ["a/foo.txt"; "a/bar.txt"; "a/b/foobar.txt"]]
 *)
-val list : Fpath.t -> Fpath.t list
+val list : < Cap.readdir ; .. > -> Fpath.t -> Fpath.t list
 
 (*
   List all regular files recursively. This excludes symlinks, among others.
@@ -27,26 +28,25 @@ val list : Fpath.t -> Fpath.t list
    Moreover while traversing dirs, list_regular_files ignores all
    Unix.Unix_error exceptions raised by Unix.lstat.
 *)
-val list_regular_files : ?keep_root:bool -> Fpath.t -> Fpath.t list
+val list_regular_files :
+  ?keep_root:bool -> < Cap.readdir ; .. > -> Fpath.t -> Fpath.t list
 
 (*
    List all files recursively. Exclude folders/directories.
    Use List_.map_filter to exclude more file types.
 *)
-val list_with_stat : Fpath.t -> (Fpath.t * Unix.stats) list
+val list_with_stat :
+  < Cap.readdir ; .. > -> Fpath.t -> (Fpath.t * Unix.stats) list
 
 (*
    Iterate over files recursively. Exclude folders/directories.
 *)
 val fold_left :
-  ('acc -> Fpath.t -> Unix.stats -> 'acc) -> 'acc -> Fpath.t -> 'acc
-
-val iter : (Fpath.t -> Unix.stats -> unit) -> Fpath.t -> unit
-
-(* internals *)
-
-(* Read the names found in a directory, excluding "." and "..". *)
-val read_dir_entries : Fpath.t -> string list
-
-(* same than read_dir_entries but return single segment Fpath.t *)
-val read_dir_entries_fpath : Fpath.t -> Fpath.t list
+  < Cap.readdir ; .. > ->
+  ('acc -> Fpath.t -> Unix.stats -> 'acc) ->
+  'acc ->
+  Fpath.t ->
+  'acc
+
+val iter :
+  < Cap.readdir ; .. > -> (Fpath.t -> Unix.stats -> unit) -> Fpath.t -> unit
diff --git libs/paths/Unit_list_files.ml libs/paths/Unit_list_files.ml
index fafdde7f4a0b..4f7e585da628 100644
--- libs/paths/Unit_list_files.ml
+++ libs/paths/Unit_list_files.ml
@@ -8,16 +8,17 @@ module TP = Testutil_paths
 
 let t = Testo.create
 
-let test_regular_file_as_root () =
+let test_regular_file_as_root (caps : < Cap.readdir ; .. >) () =
   TP.with_file_tree
     (File ("hello", Regular "yo"))
     (fun workspace ->
-      assert (List_files.list (workspace / "hello") = [ workspace / "hello" ]))
+      assert (
+        List_files.list caps (workspace / "hello") = [ workspace / "hello" ]))
 
-let test_empty_dir_as_root () =
+let test_empty_dir_as_root (caps : < Cap.readdir ; .. >) () =
   TP.with_file_tree
     (Dir ("empty", []))
-    (fun workspace -> assert (List_files.list (workspace / "empty") = []))
+    (fun workspace -> assert (List_files.list caps (workspace / "empty") = []))
 
 (* Because file listings are not guaranteed to be in any particular order. *)
 let compare_path_lists expected actual =
@@ -26,7 +27,7 @@ let compare_path_lists expected actual =
   in
   Alcotest.(check string) "equal" (sort expected) (sort actual)
 
-let test_regular_files () =
+let test_regular_files (caps : < Cap.readdir ; .. >) () =
   with_file_tree
     (Dir
        ( "root",
@@ -42,9 +43,9 @@ let test_regular_files () =
           workspace / "root" / "b";
           workspace / "root" / "c" / "d";
         ]
-        (List_files.list workspace))
+        (List_files.list caps workspace))
 
-let test_symlinks () =
+let test_symlinks (caps : < Cap.readdir ; .. >) () =
   with_file_tree
     (Dir
        ( "root",
@@ -60,9 +61,9 @@ let test_symlinks () =
           workspace / "root" / "b";
           workspace / "root" / "c";
         ]
-        (List_files.list workspace))
+        (List_files.list caps workspace))
 
-let test_ignore_symlinks () =
+let test_ignore_symlinks (caps : < Cap.readdir ; .. >) () =
   with_file_tree
     (Dir
        ( "root",
@@ -74,26 +75,26 @@ let test_ignore_symlinks () =
     (fun workspace ->
       compare_path_lists
         [ workspace / "root" / "a" ]
-        (List_files.list_regular_files workspace))
+        (List_files.list_regular_files caps workspace))
 
-let test_symlink_as_root () =
+let test_symlink_as_root (caps : < Cap.readdir ; .. >) () =
   with_file_tree
     (File ("a", Symlink "b"))
     (fun workspace ->
       let root_path = workspace / "a" in
       compare_path_lists [ root_path ]
-        (List_files.list_regular_files ~keep_root:true root_path))
+        (List_files.list_regular_files ~keep_root:true caps root_path))
 
-let tests =
+let tests (caps : < Cap.readdir ; .. >) =
   Testo.categorize_suites "List_files"
     [
       Testo.categorize "list"
         [
-          t "regular_file_as_root" test_regular_file_as_root;
-          t "empty_dir_as_root" test_empty_dir_as_root;
-          t "regular_files" test_regular_files;
-          t "symlinks" test_symlinks;
-          t "ignore_symlinks" test_ignore_symlinks;
-          t "symlink_as_root" test_symlink_as_root;
+          t "regular_file_as_root" (test_regular_file_as_root caps);
+          t "empty_dir_as_root" (test_empty_dir_as_root caps);
+          t "regular_files" (test_regular_files caps);
+          t "symlinks" (test_symlinks caps);
+          t "ignore_symlinks" (test_ignore_symlinks caps);
+          t "symlink_as_root" (test_symlink_as_root caps);
         ];
     ]
diff --git libs/paths/Unit_list_files.mli libs/paths/Unit_list_files.mli
index bb8a0f6b60f0..600237b141ba 100644
--- libs/paths/Unit_list_files.mli
+++ libs/paths/Unit_list_files.mli
@@ -2,4 +2,4 @@
    Unit tests for the List_file module
 *)
 
-val tests : Testo.t list
+val tests : < Cap.readdir ; .. > -> Testo.t list
diff --git libs/testo libs/testo
index 8cc13a965237..324883471634 160000
--- libs/testo
+++ libs/testo
@@ -1 +1 @@
-Subproject commit 8cc13a9652371453cb7cbd752798f9d1e73b3a18
+Subproject commit 324883471634eba92b1d9608b298991bee3d67b4
diff --git metrics.md metrics.md
index d33c5d27468a..8d30695ff128 100644
--- metrics.md
+++ metrics.md
@@ -379,6 +379,7 @@ dependencies data are:
 - Package name (e.g., lodash)
 - Package version (e.g., 1.2.3)
 - File path for lockfile (e.g., frontend/yarn.lock)
+- Analysis of external dependency calls. (e.g., from flask import Response, Response(status=204))
 
 ## Debugging data collected when traces are requested
 
diff --git release_changes.md release_changes.md
index 283f464495e0..5f3c20d50aff 100644
--- release_changes.md
+++ release_changes.md
@@ -1,30 +1,27 @@
-## [1.107.0](https://github.com/semgrep/semgrep/releases/tag/v1.107.0) - 2025-02-04
+## [1.108.0](https://github.com/semgrep/semgrep/releases/tag/v1.108.0) - 2025-02-12
 
 
 ### Added
 
 
-- More testing of pnpm-lock.yaml dependency parsing. (gh-2999)
-- Added a progress indicator during dependency resolution for supply chain scans. (sc-2045)
+- pro: Semgrep can now dynamically resolve dependencies for Python projects using pip, allowing it to determine transitive dependencies automatically. (sc-2069)
 
 
-### Fixed
+### Changed
 
 
-- The pro engine now respects the correct order of field resolution in Scala's
-  multiple inheritance. The type that appears later takes precedence when
-  resolving fields. For example, in `class A extends B with C with D`, the order
-  of precedence is D, C, B, and A. (code-7891)
-- pro: taint: Fixed bug in callback support, see https://semgrep.dev/playground/s/oqobX (code-7976)
-- pro: python: Fixed resolution of calls to the implementation of abstract methods.
-  See https://semgrep.dev/playground/s/X5kZ4. (code-7987)
-- Fixed the semgrep ci --help to not include experimental options
-  like --semgrep-branch (saf-1746)
-- Peer dependency relationships in package-lock.json files are tracked when parsing a dependency graph (sc-2032)
-- Peer dependency relationships in pnpm-lock.yaml files are tracked when parsing a dependency graph (sc-2033)
+- Bump base Alpine docker image from 3.19 to 3.21. (alpine-version)
+- The semgrep-appsec-platform specific metadata fields "semgrep.dev:" and
+  "semgrep.policy:" are now filtered from the JSON output unless you
+  are logged in with the Semgrep appsec platform.
+  See https://semgrep.dev/docs/semgrep-appsec-platform/json-and-sarif#json for more information. (metadata-filter)
+- The Semgrep Docker image now uses Python 3.12 (bumped from 3.11). (python-version)
 
 
-### Infra/Release Changes
+### Fixed
 
 
-- Upgrade from OCaml 4.14.0 to OCaml 5.2.1 for our Docker images (ocaml5-docker)
+- This PR changes the way we handle failures in `git worktree remove` more gracefully.
+  Instead of erroring, we continue to scan so that the user can still get results, but
+  log the error. It also adds a guard so that this failure is less likely to happen
+  and will include more debugging information when it does. (sms-521)
diff --git scripts/run-core-test scripts/run-core-test
index 1d8f07e96786..82ed42a8472c 100755
--- scripts/run-core-test
+++ scripts/run-core-test
@@ -20,8 +20,7 @@ NC='\033[0m' # No Color
# './test status' and './test approve'. See './test --help' for options.
 # This uses the in-house project testo.
 #
-# TODO: fix concurrency bugs to remove '-j0'
-./test -j0
+./test
 
 # run inline tests
 # TODO: make them part of the testo test program above.
diff --git scripts/run-coverage.py scripts/run-coverage.py
index 46edc43b3640..641a13b6da67 100755
--- scripts/run-coverage.py
+++ scripts/run-coverage.py
@@ -1,7 +1,12 @@
 #!/usr/bin/python3
+#
+# TODO: CONTEXT NEEDED
+# TODO: What/who uses this script?
+#
 import os
 import re
 
+
 path = "_build/default/tests"
 
 print("Coverage statistics for files under matching/")
diff --git semgrep.nix semgrep.nix
index 801a79079c75..78391836af4b 100644
--- semgrep.nix
+++ semgrep.nix
@@ -49,7 +49,11 @@ let
       , overlays ? [ patchesOverlay on.defaultOverlay ], inputs ? [ ] }:
       let
         # Force ocaml version
-        baseQuery = { ocaml-base-compiler = ocamlVersion; };
+        baseQuery = {
+          ocaml-base-compiler = ocamlVersion;
+          # https://github.com/tweag/opam-nix/issues/112
+          ocamlfind = "1.9.6";
+        };
         repos = [ "${opam-repository}" ];
         # repos = opamRepos to force newest version of opam
         # pkgs = pkgs to force newest version of nixpkgs instead of using opam-nix's
diff --git semgrep.opam semgrep.opam
index 85103f277763..33426b314bd6 100644
--- semgrep.opam
+++ semgrep.opam
@@ -1,6 +1,6 @@
 # This file is generated by dune, edit dune-project instead
 opam-version: "2.0"
-version: "1.107.0"
+version: "1.108.0"
 synopsis:
   "Like grep but for code: fast and syntax-aware semantic code pattern for many languages"
 description: """
@@ -64,7 +64,7 @@ depends: [
   "ocurl" {= "0.9.1"}
   "opentelemetry-client-ocurl"
   "ambient-context-lwt"
-  "conf-libcurl" {= "1"}
+  "conf-libcurl"
   "uri"
   "uuidm" {>= "0.9.9"}
   "cohttp" {= "6.0.0"}
diff --git setup.py setup.py
index 6bb33d9ef259..a9e1631abf0f 100644
--- setup.py
+++ setup.py
@@ -5,7 +5,7 @@
 
 setup(
     name="semgrep_pre_commit_package",
-    version="1.107.0",
-    install_requires=["semgrep==1.107.0"],
+    version="1.108.0",
+    install_requires=["semgrep==1.108.0"],
     packages=[],
 )
diff --git src/core/Core_result.ml src/core/Core_result.ml
index 10e07d0238d9..f8cd8dd9acb6 100644
--- src/core/Core_result.ml
+++ src/core/Core_result.ml
@@ -115,6 +115,8 @@ type t = {
   explanations : Matching_explanation.t list option;
   rules_by_engine : (Rule_ID.t * Engine_kind.t) list;
   interfile_languages_used : Analyzer.t list;
+  (* extra information *)
+  symbol_analysis : Semgrep_output_v1_t.symbol_analysis option;
 }
 [@@deriving show]
 
@@ -148,6 +150,7 @@ let mk_result_with_just_errors (errors : Core_error.t list) : t =
     explanations = None;
     rules_by_engine = [];
     interfile_languages_used = [];
+    symbol_analysis = None;
   }
 
 (* Create a match result *)
@@ -316,4 +319,5 @@ let mk_result (results : Core_profiling.file_profiling match_result list)
     rules_by_engine =
       rules_with_engine |> List_.map (fun (r, ek) -> (fst r.Rule.id, ek));
     interfile_languages_used;
+    symbol_analysis = None;
   }
diff --git src/core/Core_result.mli src/core/Core_result.mli
index 9455499d1772..9c98a1117e15 100644
--- src/core/Core_result.mli
+++ src/core/Core_result.mli
@@ -35,6 +35,12 @@ type t = {
   explanations : Matching_explanation.t list option;
   rules_by_engine : (Rule_ID.t * Engine_kind.t) list;
   interfile_languages_used : Analyzer.t list;
+  (* Scan-adjacent information optionally collected to enable
+     SSC features.
+     This information must be collected here, at the point that we
+     return from the core engine.
+  *)
+  symbol_analysis : Semgrep_output_v1_t.symbol_analysis option;
 }
 [@@deriving show]
 
diff --git src/core/SCA_match.ml src/core/SCA_match.ml
index 3dafe6c6ddce..99e0bfc210d7 100644
--- src/core/SCA_match.ml
+++ src/core/SCA_match.ml
@@ -6,6 +6,7 @@ type t = {
   kind : kind;
 }
 
+(* TODO: reuse the new Out.sca_match_kind *)
 and kind =
   (* Rule had both code patterns and dependency patterns, got matches on *both*,
   * the Pattern Match is in code, annotated with this dependency match *)
diff --git src/core/Test_tags.ml src/core/Test_tags.ml
index 1f19f1446d8e..3ef6e1e80ad6 100644
--- src/core/Test_tags.ml
+++ src/core/Test_tags.ml
@@ -9,6 +9,9 @@ open Common
 (* Should this be "js.todo"? Feel free to change it. *)
 let todo_js = Testo.Tag.declare "todo.js"
 
+(* A test that sometimes fails for unknown reasons *)
+let flaky = Testo.Tag.declare "flaky"
+
 (* "lang.none" would be shorter but possibly confusing since we're using
    the term "generic" everywhere. *)
 let lang_generic = Testo.Tag.declare "lang.generic"
diff --git src/core/Test_tags.mli src/core/Test_tags.mli
index 46f52eddfe66..fee207fc0cf0 100644
--- src/core/Test_tags.mli
+++ src/core/Test_tags.mli
@@ -4,6 +4,9 @@
 
 val todo_js : Testo.Tag.t
 
+(* A test that sometimes fails for unknown reasons *)
+val flaky : Testo.Tag.t
+
 (* This is used to exclude all the tests involving this or that language. *)
 val tags_of_lang : Lang.t -> Testo.Tag.t list
 val tags_of_langs : Lang.t list -> Testo.Tag.t list
diff --git src/core/Version.ml src/core/Version.ml
index fda074b6b3fa..b8030b7656d1 100644
--- src/core/Version.ml
+++ src/core/Version.ml
@@ -3,4 +3,4 @@
 
   Automatically modified by scripts/release/bump.
 *)
-let version = "1.107.0"
+let version = "1.108.0"
diff --git src/core_cli/Core_CLI.ml src/core_cli/Core_CLI.ml
index 9620365b14a9..bf64a838f604 100644
--- src/core_cli/Core_CLI.ml
+++ src/core_cli/Core_CLI.ml
@@ -115,6 +115,12 @@ let ncores = ref Core_scan_config.default.ncores
 let filter_irrelevant_rules =
   ref Core_scan_config.default.filter_irrelevant_rules
 
+(* ------------------------------------------------------------------------- *)
+(* scan-adjacent information *)
+(* ------------------------------------------------------------------------- *)
+
+let symbol_analysis = ref Core_scan_config.default.symbol_analysis
+
 (* ------------------------------------------------------------------------- *)
 (* pad's action flag *)
 (* ------------------------------------------------------------------------- *)
@@ -359,6 +365,8 @@ let mk_config () : Core_scan_config.t =
                  without -trace.");
           None
       | false, None -> None);
+    (* only settable via the Pro binary *)
+    symbol_analysis = !symbol_analysis;
   }
 
 (*****************************************************************************)
@@ -382,7 +390,7 @@ let all_actions (caps : Cap.all_caps) () =
       " <files or dirs> generate parsing statistics (use -json for JSON output)",
       Arg_.mk_action_n_arg (fun xs ->
           Test_parsing.parsing_stats
-            (caps :> < Cap.time_limit ; Cap.memory_limit >)
+            (caps :> < Cap.time_limit ; Cap.memory_limit ; Cap.readdir >)
             (Lang.of_opt_exn !lang)
             ~json:
               (match !output_format with
@@ -462,8 +470,9 @@ let all_actions (caps : Cap.all_caps) () =
     ( "-test_parse_tree_sitter",
       " <dir> test tree-sitter parser on target files",
       Arg_.mk_action_1_arg (fun root ->
-          Test_parsing.test_parse_tree_sitter (Lang.of_opt_exn !lang)
-            (Fpath.v root)) );
+          Test_parsing.test_parse_tree_sitter
+            (caps :> < Cap.readdir >)
+            (Lang.of_opt_exn !lang) (Fpath.v root)) );
     ( "-translate_rules",
       " <files or dirs>",
       Arg_.mk_action_n_conv Fpath.v
@@ -474,7 +483,8 @@ let all_actions (caps : Cap.all_caps) () =
         (Check_rule.stat_files (caps :> < Cap.stdout >)) );
     ( "-parse_rules",
       " <dir>",
-      Arg_.mk_action_1_conv Fpath.v Test_parsing.test_parse_rules );
+      Arg_.mk_action_1_conv Fpath.v
+        (Test_parsing.test_parse_rules (caps :> < Cap.readdir >)) );
     ("-test_eval", " <JSON file>", Arg_.mk_action_1_arg Eval_generic.test_eval);
     ( "-sarif_sort",
       " <JSON file>",
@@ -621,7 +631,7 @@ let options caps (actions : unit -> Arg_.cmdline_actions) =
       ( "-rpc",
         Arg.Unit
           (fun () ->
-            RPC.main (caps :> < Cap.exec ; Cap.tmp >);
+            RPC.main (caps :> < Cap.exec ; Cap.tmp ; Cap.network >);
             Core_exit_code.(exit_semgrep caps#exit Success)),
         " don't use this unless you already know" );
     ]
diff --git src/core_cli/Core_CLI.mli src/core_cli/Core_CLI.mli
index 1576d2318551..a016a49e27d2 100644
--- src/core_cli/Core_CLI.mli
+++ src/core_cli/Core_CLI.mli
@@ -9,6 +9,7 @@ val profile : bool ref
 val log_to_file : Fpath.t option ref
 val trace : bool ref
 val env_extra : string
+val symbol_analysis : bool ref
 
 (* compute Core_scan_config.t given command-line flags *)
 val mk_config : unit -> Core_scan_config.t
@@ -24,7 +25,7 @@ val output_core_results :
 *)
 
 val options :
-  < Cap.exec ; Cap.exit ; Cap.stdout ; Cap.tmp ; .. > ->
+  < Cap.exec ; Cap.exit ; Cap.stdout ; Cap.tmp ; Cap.network ; .. > ->
   (unit -> Arg_.action_spec list) ->
   Arg_.cmdline_options
 
diff --git src/core_scan/Core_scan.ml src/core_scan/Core_scan.ml
index f0e954a2a379..dbd2ced928a3 100644
--- src/core_scan/Core_scan.ml
+++ src/core_scan/Core_scan.ml
@@ -353,8 +353,9 @@ let targets_of_config (config : Core_scan_config.t) (rules : Rule.t list) :
           let targeting_conf =
             translate_targeting_conf_from_pysemgrep targeting_conf
           in
+          let caps = Cap.readdir_UNSAFE () in
           let target_paths, errors, skipped =
-            Find_targets.get_target_fpaths targeting_conf scanning_roots
+            Find_targets.get_target_fpaths caps targeting_conf scanning_roots
           in
           let targets =
             Core_targeting.targets_for_files_and_rules target_paths rules
diff --git src/core_scan/Core_scan_config.ml src/core_scan/Core_scan_config.ml
index 77903f1973b4..5a158d63a758 100644
--- src/core_scan/Core_scan_config.ml
+++ src/core_scan/Core_scan_config.ml
@@ -77,6 +77,7 @@ type t = {
   filter_irrelevant_rules : bool;
   (* telemetry *)
   tracing : Tracing.config option;
+  symbol_analysis : bool;
 }
 [@@deriving show]
 
@@ -116,4 +117,5 @@ let default =
     filter_irrelevant_rules = true;
     (* debugging and telemetry flags *)
     tracing = None;
+    symbol_analysis = false;
   }
diff --git src/core_scan/Core_targeting.mli src/core_scan/Core_targeting.mli
index f0704e1569a7..b90905e31e8b 100644
--- src/core_scan/Core_targeting.mli
+++ src/core_scan/Core_targeting.mli
@@ -6,7 +6,7 @@
    by the legacy semgrep-core input interface.
 *)
 
-(* reused in semgrep-server in pro and for Git_remote.ml in pro *)
+(* reused for Git_remote.ml in pro *)
 val split_jobs_by_language :
   Find_targets.conf -> Rule.t list -> Fpath.t list -> Lang_job.t list
 
diff --git src/engine/tests/Test_engine.ml src/engine/tests/Test_engine.ml
index 9d34b10ef1d9..c6f0cf06f719 100644
--- src/engine/tests/Test_engine.ml
+++ src/engine/tests/Test_engine.ml
@@ -91,9 +91,11 @@ let xtarget_of_file (analyzer : Analyzer.t) (target : Fpath.t) : Xtarget.t =
 (* target helpers *)
 (*****************************************************************************)
 
-let find_target_of_yaml_file_opt (file : Fpath.t) : Fpath.t option =
+let find_target_of_yaml_file_opt (caps : < Cap.readdir ; .. >) (file : Fpath.t)
+    : Fpath.t option =
   let d, b, ext = Filename_.dbe_of_filename !!file in
-  Common2.readdir_to_file_list d @ Common2.readdir_to_link_list d
+  let entries = CapFS.read_dir_entries caps (Fpath.v d) in
+  entries
   |> List_.find_some_opt (fun file2 ->
          let path2 = Filename.concat d file2 in
          (* Config files have a single .yaml extension (assumption),
@@ -116,8 +118,8 @@ let find_target_of_yaml_file_opt (file : Fpath.t) : Fpath.t option =
              then Some (Fpath.v path2)
              else None)
 
-let find_target_of_yaml_file (file : Fpath.t) : Fpath.t =
-  match find_target_of_yaml_file_opt file with
+let find_target_of_yaml_file caps (file : Fpath.t) : Fpath.t =
+  match find_target_of_yaml_file_opt caps file with
   | Some x -> x
   | None -> failwith (spf "could not find a target for %s" !!file)
 
@@ -183,7 +185,8 @@ let check_parse_errors (rule_file : Fpath.t) (errors : Core_error.ErrorSet.t) :
 (* Main logic *)
 (*****************************************************************************)
 
-let read_rules_file ~get_analyzer ?fail_callback rule_file =
+let read_rules_file ~get_analyzer ?fail_callback (caps : < Cap.readdir ; .. >)
+    rule_file =
   match Parse_rule.parse rule_file with
   (* TODO: fail better with invalid rules? *)
   | Error _ -> None
@@ -199,7 +202,7 @@ let read_rules_file ~get_analyzer ?fail_callback rule_file =
       None
   | Ok rules ->
       let analyzer = get_analyzer rule_file rules in
-      let target = find_target_of_yaml_file rule_file in
+      let target = find_target_of_yaml_file caps rule_file in
       Logs.info (fun m ->
           m "processing target %s (with analyzer %s)" !!target
             (Analyzer.to_string analyzer));
@@ -217,10 +220,10 @@ let read_rules_file ~get_analyzer ?fail_callback rule_file =
 
 let make_test_rule_file ?(fail_callback = fun _i m -> Alcotest.fail m)
     ?(get_analyzer = single_analyzer_from_rules) ?(prepend_lang = false)
-    (rule_file : Fpath.t) : Testo.t =
+    (caps : < Cap.readdir ; .. >) (rule_file : Fpath.t) : Testo.t =
   let test () =
     Logs.info (fun m -> m "processing rules  %s" !!rule_file);
-    match read_rules_file ~get_analyzer ~fail_callback rule_file with
+    match read_rules_file ~get_analyzer ~fail_callback caps rule_file with
     | None -> ()
     | Some (rules, target, analyzer) -> (
         (* expected *)
@@ -274,7 +277,7 @@ let make_test_rule_file ?(fail_callback = fun _i m -> Alcotest.fail m)
   in
 
   (* end of let test () *)
-  match find_target_of_yaml_file_opt rule_file with
+  match find_target_of_yaml_file_opt caps rule_file with
   | Some target_path ->
       (* This assumes we can guess the target programming language
          from the file extension. *)
@@ -299,16 +302,17 @@ let find_rule_files roots =
  * (or wait that we switch to osemgrep test for our own test infra in which
  * case this whole file will be deleted)
  *)
-let collect_tests ?(get_analyzer = single_analyzer_from_rules)
+let collect_tests ?(get_analyzer = single_analyzer_from_rules) caps
     (xs : Fpath.t list) =
   xs |> find_rule_files
   |> List_.filter_map (fun rule_file ->
          let* _rules, target, analyzer =
-           read_rules_file ~get_analyzer rule_file
+           read_rules_file ~get_analyzer caps rule_file
          in
          Some (rule_file, target, analyzer))
 
-let make_tests ?fail_callback ?get_analyzer ?prepend_lang (xs : Fpath.t list) :
-    Testo.t list =
+let make_tests ?fail_callback ?get_analyzer ?prepend_lang caps
+    (xs : Fpath.t list) : Testo.t list =
   xs |> find_rule_files
-  |> List_.map (make_test_rule_file ?fail_callback ?get_analyzer ?prepend_lang)
+  |> List_.map
+       (make_test_rule_file ?fail_callback ?get_analyzer ?prepend_lang caps)
diff --git src/engine/tests/Test_engine.mli src/engine/tests/Test_engine.mli
index 857355683512..17d301b8191c 100644
--- src/engine/tests/Test_engine.mli
+++ src/engine/tests/Test_engine.mli
@@ -11,19 +11,23 @@ val make_tests :
   ?get_analyzer:(Fpath.t -> Rule.rules -> Analyzer.t) ->
   (* default to false *)
   ?prepend_lang:bool ->
+  < Cap.readdir ; .. > ->
   Fpath.t list ->
   Testo.t list
 
 (* For Pro tests *)
 val collect_tests :
   ?get_analyzer:(Fpath.t -> Rule.rules -> Analyzer.t) ->
+  < Cap.readdir ; .. > ->
   Fpath.t list (* roots *) ->
   (Fpath.t (* rule file *) * Fpath.t (* target file *) * Analyzer.t) list
 
 (* helpers used in Test_subcommand.ml
  * TODO? Move in Rule_tests.mli?
  *)
-val find_target_of_yaml_file_opt : Fpath.t -> Fpath.t option
+val find_target_of_yaml_file_opt :
+  < Cap.readdir ; .. > -> Fpath.t -> Fpath.t option
+
 val analyzers_of_rules : Rule.t list -> Analyzer.t list
 val first_analyzer_of_rules : Rule.t list -> Analyzer.t
 val xtarget_of_file : Analyzer.t -> Fpath.t -> Xtarget.t
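The diffs above thread an explicit `Cap.readdir` capability into every helper that lists directories (`CapFS.read_dir_entries caps …` replacing the ambient `Common2.readdir_to_file_list`). A minimal Python sketch of the capability-passing idea, with hypothetical names; the OCaml code enforces this at the type level rather than at runtime.

```python
import os
from typing import List

class ReaddirCap:
    """Token granting the right to list directories (hypothetical name).

    In the OCaml code, capabilities are minted once at the program entry
    point and passed down explicitly; this class only mimics the shape.
    """

def read_dir_entries(caps: ReaddirCap, path: str) -> List[str]:
    # Callers must hold the capability; there is no ambient readdir.
    if not isinstance(caps, ReaddirCap):
        raise PermissionError("caller does not hold the readdir capability")
    return sorted(os.listdir(path))

def find_target_of_rule_file(caps: ReaddirCap, rule_file: str) -> str:
    """Find a sibling target sharing the rule file's base name (illustrative)."""
    d = os.path.dirname(rule_file) or "."
    base, _ = os.path.splitext(os.path.basename(rule_file))
    for entry in read_dir_entries(caps, d):
        name, ext = os.path.splitext(entry)
        if name == base and ext not in (".yaml", ".yml"):
            return os.path.join(d, entry)
    raise FileNotFoundError(f"could not find a target for {rule_file}")
```

Because every directory read goes through `read_dir_entries`, the set of functions that can touch the filesystem is visible from their signatures alone, which is the point of the `< Cap.readdir ; .. >` annotations above.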
diff --git src/engine/tests/Unit_engine.ml src/engine/tests/Unit_engine.ml
index a59ead9317b3..2a5f7438a8d8 100644
--- src/engine/tests/Unit_engine.ml
+++ src/engine/tests/Unit_engine.ml
@@ -763,11 +763,11 @@ let lang_tainting_tests () =
 (* DEPRECATED: this is redundant because we now have 'make rules-test'
  * which calls 'osemgrep-pro test --pro tests/rules tests/rules_v2'
  *)
-let full_rule_regression_tests () =
+let full_rule_regression_tests (caps : < Cap.readdir ; .. >) =
   let path = tests_path / "rules" in
-  let tests1 = Test_engine.make_tests ~prepend_lang:true [ path ] in
+  let tests1 = Test_engine.make_tests ~prepend_lang:true caps [ path ] in
   let path = tests_path / "rules_v2" in
-  let tests2 = Test_engine.make_tests ~prepend_lang:true [ path ] in
+  let tests2 = Test_engine.make_tests ~prepend_lang:true caps [ path ] in
   let tests = tests1 @ tests2 in
   let groups =
     tests
@@ -795,9 +795,9 @@ let full_rule_regression_tests () =
  * DEPRECATED: this is redundant because we now have 'make rules-test'
  * which calls 'osemgrep-pro test --pro tests/taint_maturity'
  *)
-let full_rule_taint_maturity_tests () =
+let full_rule_taint_maturity_tests caps =
   let path = tests_path / "taint_maturity" in
-  Testo.categorize "taint maturity" (Test_engine.make_tests [ path ])
+  Testo.categorize "taint maturity" (Test_engine.make_tests caps [ path ])
 
 (*
    Special exclusions for Semgrep JS
@@ -825,9 +825,9 @@ let mark_todo_js (test : Testo.t) =
  * DEPRECATED: this is redundant because we now have 'make rules-test'
  * which calls 'osemgrep-pro test --pro tests/semgrep-rules'
  *)
-let semgrep_rules_repo_tests () : Testo.t list =
+let semgrep_rules_repo_tests caps : Testo.t list =
   let path = tests_path / "semgrep-rules" in
-  let tests = Test_engine.make_tests [ path ] in
+  let tests = Test_engine.make_tests caps [ path ] in
   let groups =
     tests
     |> List_.filter_map (fun (test : Testo.t) ->
@@ -884,7 +884,7 @@ let semgrep_rules_repo_tests () : Testo.t list =
                  Some (String.capitalize_ascii s)
              (* this skips the semgrep-rules/.github entries *)
              | _ ->
-                 Logs.info (fun m -> m "skipping %s" test.name);
+                 Logs.debug (fun m -> m "skipping %s" test.name);
                  None
            in
            group_opt |> Option.map (fun groupname -> (groupname, test)))
@@ -912,7 +912,7 @@ let semgrep_rules_repo_tests () : Testo.t list =
 (* All tests *)
 (*****************************************************************************)
 
-let tests () =
+let tests (caps : < Cap.readdir ; .. >) =
   List_.flatten
     [
       (* full testing for many languages *)
@@ -922,7 +922,7 @@ let tests () =
       filter_irrelevant_rules_tests ();
       lang_tainting_tests ();
       maturity_tests ();
-      full_rule_taint_maturity_tests ();
-      full_rule_regression_tests ();
-      semgrep_rules_repo_tests ();
+      full_rule_taint_maturity_tests caps;
+      full_rule_regression_tests caps;
+      semgrep_rules_repo_tests caps;
     ]
diff --git src/engine/tests/Unit_engine.mli src/engine/tests/Unit_engine.mli
index 0444a666b57b..8e5cb38b8d2d 100644
--- src/engine/tests/Unit_engine.mli
+++ src/engine/tests/Unit_engine.mli
@@ -3,7 +3,7 @@
    to the current location. Having them created on demand allows running
    'dune utop' from any location.
 *)
-val tests : unit -> Testo.t list
+val tests : < Cap.readdir ; .. > -> Testo.t list
 
 type fix_type =
   | Fix of string
diff --git src/lsp/Test_LS_e2e.ml src/lsp/Test_LS_e2e.ml
index 25b44ef371d6..d56b33b92918 100644
--- src/lsp/Test_LS_e2e.ml
+++ src/lsp/Test_LS_e2e.ml
@@ -1061,7 +1061,7 @@ let test_ls_multi caps () =
       ignore info;
       Lwt.return_unit)
 
-let _test_login caps () =
+let test_login caps () =
   with_session caps (fun { server = info; root } ->
       (* If we don't log out prior to starting this test, the LS will complain
          we're already logged in, and not display the correct behavior.
@@ -1238,19 +1238,39 @@ let sync f () = Lwt_platform.run (f ())
 
 (* Create an lwt test and a synchronous test right away because we run
    both and it's hard to convert from one to the other. *)
-let pair ?tolerate_chdir name func =
-  ( Testo.create ?tolerate_chdir name (sync func),
-    Testo_lwt.create ?tolerate_chdir name func )
+let pair ?expected_outcome ?skipped ?tags ?tolerate_chdir name func =
+  let expected_outcome_lwt : Testo_lwt.expected_outcome option =
+    (* ugh: Testo.expected_outcome and Testo_lwt.expected_outcome
+       are considered different types so we need to do this translation *)
+    match (expected_outcome : Testo.expected_outcome option) with
+    | None -> None
+    | Some (Should_fail msg) -> Some (Should_fail msg)
+    | Some Should_succeed -> Some Should_succeed
+  in
+  (* We prevent runs in parallel with other tests because the test 'LS specs'
+     sometimes fails.
+     TODO: make sure the tests can run in parallel and remove '~solo'. *)
+  let solo = "possibly flaky when run in parallel" in
+  ( Testo.create ?tags ?expected_outcome ?skipped ~solo ?tolerate_chdir name
+      (sync func),
+    Testo_lwt.create ?tags ?expected_outcome:expected_outcome_lwt ?skipped ~solo
+      ?tolerate_chdir name func )
 
 let promise_tests caps =
   [
-    pair "LS specs" (test_ls_specs caps) ~tolerate_chdir:true;
+    pair "LS specs" (test_ls_specs caps) ~tags:[ Test_tags.flaky ]
+      ~tolerate_chdir:true;
     (* Keep this test commented out while it is xfail.
         Because logging in is side-effecting, if the test never completes, we
         will stay logged in, which can mangle some of the later tests.
-       Test_lwt.create "Test LS login" (test_login caps)
-       ~expected_outcome:
-         (Should_fail "TODO: currently failing in js tests in CI"); *)
+    *)
+    pair "Test LS login" (test_login caps)
+      ~skipped:
+        {|Keep this test commented out while it is xfail.
+Because logging in is side-effecting, if the test never completes, we
+will stay logged in, which can mangle some of the later tests.|}
+      ~expected_outcome:
+        (Should_fail "TODO: currently failing in js tests in CI");
     pair "LS /semgrep/search includes/excludes"
       (test_search_includes_excludes caps)
       ~tolerate_chdir:true;
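The `pair` helper above builds a synchronous and an Lwt variant of each test up front because converting one into the other later is hard. A rough Python analogue, assuming asyncio in place of Lwt and plain dicts in place of Testo test values (all names illustrative):

```python
import asyncio
from typing import Awaitable, Callable, Tuple

def pair(name: str, afunc: Callable[[], Awaitable], tags: Tuple[str, ...] = ()):
    """Create a synchronous and an async variant of one async test.

    Rough analogue of the 'pair' helper in Test_LS_e2e.ml: both variants
    are registered right away, sharing the same name and tags, because
    translating between the two test types after the fact is awkward.
    """
    def sync_body():
        # The sync variant just drives the event loop itself.
        return asyncio.run(afunc())

    sync_test = {"name": name, "tags": tags, "body": sync_body}
    async_test = {"name": name, "tags": tags, "body": afunc}
    return sync_test, async_test
```

Attributes such as `tags` or an expected outcome are passed once and applied to both variants, which mirrors how the diff forwards `?tags`, `?skipped`, and `?expected_outcome` to both `Testo.create` and `Testo_lwt.create`.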
diff --git src/lsp/Unit_LS.ml src/lsp/Unit_LS.ml
index 85ccf8b7a84e..3251d4e71be7 100644
--- src/lsp/Unit_LS.ml
+++ src/lsp/Unit_LS.ml
@@ -97,6 +97,7 @@ let mock_run_results (files : string list) : Core_runner.result =
       (* If the engine requested is OSS, there must be no
          interfile requested languages *)
       interfile_languages_used = Some [];
+      symbol_analysis = None;
     }
   in
   Core_runner_result.{ core; hrules; scanned }
diff --git src/lsp/server/Session.ml src/lsp/server/Session.ml
index a23fac0ed5b7..51b4db4409c0 100644
--- src/lsp/server/Session.ml
+++ src/lsp/server/Session.ml
@@ -38,14 +38,8 @@ let scan_config_parser_ref = ref OutJ.scan_config_of_string
 (* Types *)
 (*****************************************************************************)
 
-(* =~ Core_scan.caps + random + network + tmp *)
 type caps =
-  < Cap.random
-  ; Cap.network
-  ; Cap.tmp
-  ; Cap.fork
-  ; Cap.time_limit
-  ; Cap.memory_limit >
+  < Core_scan.caps ; Cap.random ; Cap.network ; Cap.tmp ; Cap.readdir >
 
 (* We really don't want mutable state in the server.
    This is the only exception since this stuff requires network requests that
@@ -133,13 +127,13 @@ let decode_rules caps data =
           (* There shouldn't be any errors, because we got these rules from CI. *)
           failwith "impossible: received invalid rules from Deployment")
 
-let get_targets session (root : Fpath.t) =
+let get_targets (session : t) (root : Fpath.t) =
   let targets_conf =
     User_settings.find_targets_conf_of_t session.user_settings
   in
   let proj_root = Rfpath.of_fpath_exn root in
   let targets, _errors, _skipped_targets =
-    Find_targets.get_target_fpaths
+    Find_targets.get_target_fpaths session.caps
       {
         targets_conf with
         force_project_root = Some (Find_targets.Filesystem proj_root);
diff --git src/lsp/server/Session.mli src/lsp/server/Session.mli
index adf50be41dd1..99d2da96f768 100644
--- src/lsp/server/Session.mli
+++ src/lsp/server/Session.mli
@@ -5,14 +5,8 @@ val scan_config_parser_ref : (string -> Semgrep_output_v1_t.scan_config) ref
 (** [scan_config_parser_ref] is a reference to a function that parses a scan
     config from a string *)
 
-(* =~ Core_scan.caps + random + network + tmp *)
 type caps =
-  < Cap.random
-  ; Cap.network
-  ; Cap.tmp
-  ; Cap.fork
-  ; Cap.time_limit
-  ; Cap.memory_limit >
+  < Core_scan.caps ; Cap.random ; Cap.network ; Cap.tmp ; Cap.readdir >
 
 type session_cache = {
   mutable rules : Rule.t list;
diff --git src/lsp/server/User_settings.ml src/lsp/server/User_settings.ml
index fe85760bd504..113773096dfa 100644
--- src/lsp/server/User_settings.ml
+++ src/lsp/server/User_settings.ml
@@ -72,4 +72,5 @@ let core_runner_conf_of_t settings : Core_runner.conf =
       strict = false;
       matching_explanations = false;
       time_flag = false;
+      symbol_analysis = false;
     }
diff --git src/metachecking/Test_metachecking.ml src/metachecking/Test_metachecking.ml
index 86c079d1a459..855e4286100f 100644
--- src/metachecking/Test_metachecking.ml
+++ src/metachecking/Test_metachecking.ml
@@ -31,7 +31,8 @@ module TCM = Test_compare_matches
 (* Entry point *)
 (*****************************************************************************)
 
-let test_rules ?(unit_testing = false) (caps : Core_scan.caps) xs =
+let test_rules ?(unit_testing = false)
+    (caps : < Core_scan.caps ; Cap.readdir ; .. >) xs =
   let fullxs =
     xs
     |> File_type.files_of_dirs_or_files (function
@@ -56,7 +57,8 @@ let test_rules ?(unit_testing = false) (caps : Core_scan.caps) xs =
          let target =
            try
              let d, b, ext = Filename_.dbe_of_filename !!file in
-             Common2.readdir_to_file_list d @ Common2.readdir_to_link_list d
+             let entries = CapFS.read_dir_entries caps (Fpath.v d) in
+             entries
              |> List_.find_some (fun file2 ->
                     let path2 = Filename.concat d file2 |> Fpath.v in
                     (* Config files have a single .yaml extension (assumption),
diff --git src/metachecking/Test_metachecking.mli src/metachecking/Test_metachecking.mli
index ac9b07c3d4be..923a57bc16ff 100644
--- src/metachecking/Test_metachecking.mli
+++ src/metachecking/Test_metachecking.mli
@@ -1 +1,5 @@
-val test_rules : ?unit_testing:bool -> Core_scan.caps -> Fpath.t list -> unit
+val test_rules :
+  ?unit_testing:bool ->
+  < Core_scan.caps ; Cap.readdir ; .. > ->
+  Fpath.t list ->
+  unit
diff --git src/metachecking/Unit_metachecking.mli src/metachecking/Unit_metachecking.mli
index 82f979253ac6..093cde209bf3 100644
--- src/metachecking/Unit_metachecking.mli
+++ src/metachecking/Unit_metachecking.mli
@@ -1 +1 @@
-val tests : Core_scan.caps -> Testo.t list
+val tests : < Core_scan.caps ; Cap.readdir ; .. > -> Testo.t list
diff --git src/osemgrep/cli/CLI.ml src/osemgrep/cli/CLI.ml
index ae91f7f7461a..b9af1122046c 100644
--- src/osemgrep/cli/CLI.ml
+++ src/osemgrep/cli/CLI.ml
@@ -43,6 +43,7 @@ type caps =
   ; Cap.random
   ; Cap.signal
   ; Cap.tmp
+  ; Cap.readdir
   ; Cap.chdir
   ; Cap.fork
   ; Cap.time_limit
@@ -56,14 +57,15 @@ let default_subcommand = "scan"
 (* alt: define our own Pro_CLI.ml in semgrep-pro
  * old: was Interactive_subcommand.main
  *)
-let hook_semgrep_interactive : (string array -> Exit_code.t) Hook.t =
-  Hook.create (fun _argv ->
+let hook_semgrep_interactive :
+    (< Cap.readdir > -> string array -> Exit_code.t) Hook.t =
+  Hook.create (fun _caps _argv ->
       failwith "semgrep interactive not available (requires semgrep pro)")
 
 let hook_semgrep_publish :
     (< Cap.stdout ; Cap.network ; Cap.tmp > -> string array -> Exit_code.t)
     Hook.t =
-  Hook.create (fun _argv ->
+  Hook.create (fun _caps _argv ->
       failwith "semgrep publsh not available (requires semgrep pro)")
 
 let hook_semgrep_show : (caps -> string array -> Exit_code.t) Hook.t =
@@ -213,7 +215,10 @@ let dispatch_subcommand (caps : caps) (argv : string array) =
         | "logout" ->
             Logout_subcommand.main (caps :> < Cap.stdout >) subcmd_argv
         | "install-ci" -> Install_ci_subcommand.main caps subcmd_argv
-        | "interactive" -> (Hook.get hook_semgrep_interactive) subcmd_argv
+        | "interactive" ->
+            (Hook.get hook_semgrep_interactive)
+              (caps :> < Cap.readdir >)
+              subcmd_argv
         | "show" -> (Hook.get hook_semgrep_show) caps subcmd_argv
         | "test" -> Test_subcommand.main caps subcmd_argv
         | "validate" -> Validate_subcommand.main caps subcmd_argv
diff --git src/osemgrep/cli/CLI.mli src/osemgrep/cli/CLI.mli
index 902a63d61f31..4dfe66c40232 100644
--- src/osemgrep/cli/CLI.mli
+++ src/osemgrep/cli/CLI.mli
@@ -1,5 +1,5 @@
 (* no exit, no argv
- * TODO: Cap.files_argv, Cap.domain, Cap.thread, Cap.alarm
+ * TODO: Cap.files_argv, Cap.domain, Cap.thread
  *)
 type caps =
   < Cap.stdout
@@ -8,6 +8,7 @@ type caps =
   ; Cap.random
   ; Cap.signal
   ; Cap.tmp
+  ; Cap.readdir
   ; Cap.chdir
   ; Cap.fork
   ; Cap.time_limit
@@ -27,7 +28,8 @@ type caps =
 val main : caps -> string array -> Exit_code.t
 
 (* osemgrep-pro hooks *)
-val hook_semgrep_interactive : (string array -> Exit_code.t) Hook.t
+val hook_semgrep_interactive :
+  (< Cap.readdir > -> string array -> Exit_code.t) Hook.t
 
 val hook_semgrep_publish :
   (< Cap.stdout ; Cap.network ; Cap.tmp > -> string array -> Exit_code.t) Hook.t
diff --git src/osemgrep/cli_ci/Ci_CLI.ml src/osemgrep/cli_ci/Ci_CLI.ml
index 4b07dd111af1..282e1d409834 100644
--- src/osemgrep/cli_ci/Ci_CLI.ml
+++ src/osemgrep/cli_ci/Ci_CLI.ml
@@ -232,7 +232,9 @@ let scan_subset_cmdline_term : Scan_CLI.conf Term.t =
       pro_path_sensitive rewrite_rule_ids sarif sarif_outputs
       scan_unknown_extensions secrets text text_outputs timeout
       _timeout_interfileTODO timeout_threshold trace trace_endpoint use_git
-      version_check vim vim_outputs =
+      _use_semgrepignore_v2 version_check vim vim_outputs x_trTODO =
+    (* this is just handled by psemgrep for now *)
+    ignore x_trTODO;
     let output_format : Output_format.t =
       Scan_CLI.output_format_conf ~text ~files_with_matches ~json ~emacs ~vim
         ~sarif ~gitlab_sast ~gitlab_secrets ~junit_xml
@@ -280,6 +282,8 @@ let scan_subset_cmdline_term : Scan_CLI.conf Term.t =
         strict = false;
         time_flag = false;
         matching_explanations;
+        (* coupling(symbol-analysis): this will be set later by the scan config *)
+        symbol_analysis = false;
       }
     in
     let include_ =
@@ -374,7 +378,8 @@ let scan_subset_cmdline_term : Scan_CLI.conf Term.t =
     $ SC.o_sarif_outputs $ SC.o_scan_unknown_extensions $ SC.o_secrets
     $ SC.o_text $ SC.o_text_outputs $ SC.o_timeout $ SC.o_timeout_interfile
     $ SC.o_timeout_threshold $ SC.o_trace $ SC.o_trace_endpoint $ SC.o_use_git
-    $ SC.o_version_check $ SC.o_vim $ SC.o_vim_outputs)
+    $ SC.o_use_semgrepignore_v2 $ SC.o_version_check $ SC.o_vim
+    $ SC.o_vim_outputs $ SC.o_tr)
 
 (*************************************************************************)
 (* Turn argv into conf *)
diff --git src/osemgrep/cli_ci/Ci_subcommand.ml src/osemgrep/cli_ci/Ci_subcommand.ml
index 6aa2ffa27060..c25d9e59a419 100644
--- src/osemgrep/cli_ci/Ci_subcommand.ml
+++ src/osemgrep/cli_ci/Ci_subcommand.ml
@@ -107,6 +107,7 @@ type caps =
   ; Cap.exec
   ; Cap.tmp
   ; Cap.chdir
+  ; Cap.readdir
   ; Cap.fork
   ; Cap.time_limit
   ; Cap.memory_limit >
@@ -966,6 +967,7 @@ let run_conf (caps : < caps ; .. >) (ci_conf : Ci_CLI.conf) : Exit_code.t =
         dependency_query = _;
         path_to_transitivity = _;
         scan_all_deps_in_diff_scan = _;
+        symbol_analysis;
         ignored_files = _;
         product_ignored_files = _;
         generic_slow_rollout = _;
@@ -976,6 +978,16 @@ let run_conf (caps : < caps ; .. >) (ci_conf : Ci_CLI.conf) : Exit_code.t =
     scan_response
   in
 
+  (* coupling(symbol-analysis): we should update our config with the
+     scan-relevant flags we receive from our scan config
+  *)
+  let conf =
+    {
+      conf with
+      core_runner_conf = { conf.core_runner_conf with symbol_analysis };
+    }
+  in
+
   (* TODO:
      if dataflow_traces is None:
        dataflow_traces = engine_type.has_dataflow_traces
@@ -1022,7 +1034,7 @@ let run_conf (caps : < caps ; .. >) (ci_conf : Ci_CLI.conf) : Exit_code.t =
     in
 
     let targets_and_ignored =
-      Find_targets.get_target_fpaths conf.targeting_conf [ target_root ]
+      Find_targets.get_target_fpaths caps conf.targeting_conf [ target_root ]
     in
     let res =
       Scan_subcommand.check_targets_with_rules
@@ -1041,7 +1053,7 @@ let run_conf (caps : < caps ; .. >) (ci_conf : Ci_CLI.conf) : Exit_code.t =
         app.report_failure caps' ~scan_id e;
         Logs.err (fun m -> m "Encountered error when running rules");
         e
-    | Ok (filtered_rules, _res, cli_output) ->
+    | Ok (filtered_rules, res, cli_output) ->
         (* step5: upload the findings *)
         let _cai_rules, blocking_rules, non_blocking_rules =
           partition_rules filtered_rules
@@ -1086,6 +1098,26 @@ let run_conf (caps : < caps ; .. >) (ci_conf : Ci_CLI.conf) : Exit_code.t =
           upload_findings caps' app deployment_name scan_id prj_meta
             blocking_findings filtered_rules cli_output
         in
+
+        (* Upload scan-adjacent information, such as symbol analysis
+           (needed for SSC features)
+           This will not return anything interesting, but will report its
+           status in the logs. We shouldn't let symbol analysis affect
+           the actual scan's behavior.
+        *)
+        (match res.core.symbol_analysis with
+        | None -> ()
+        | Some symbol_analysis -> (
+            match
+              Semgrep_App.upload_symbol_analysis caps' ~token:caps'#token
+                ~scan_id symbol_analysis
+            with
+            | Error msg ->
+                Logs.warn (fun m ->
+                    m "Failed to upload symbol analysis: %s" msg)
+            | Ok msg ->
+                Logs.debug (fun m ->
+                    m "Uploading symbol analysis succeeded with %s" msg)));
         let audit_mode = false in
         (* TODO: audit_mode = metadata.event_name in audit_on *)
         exit_code_of_blocking_findings ~audit_mode ~on:prj_meta.on
diff --git src/osemgrep/cli_ci/Ci_subcommand.mli src/osemgrep/cli_ci/Ci_subcommand.mli
index db67dffa3b7b..75b690ad5e4c 100644
--- src/osemgrep/cli_ci/Ci_subcommand.mli
+++ src/osemgrep/cli_ci/Ci_subcommand.mli
@@ -4,6 +4,7 @@ type caps =
   ; Cap.exec
   ; Cap.tmp
   ; Cap.chdir
+  ; Cap.readdir
   ; Cap.fork
   ; Cap.time_limit
   ; Cap.memory_limit >
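
The `< Cap.readdir ; .. >` annotations threaded through this diff are ordinary OCaml structural object types with a row variable (`..`), which is why a subcommand holding a larger capability bundle can pass it to a function that demands only `Cap.readdir`. A minimal self-contained sketch of the idea, with hypothetical stand-in capabilities rather than semgrep's real `Cap` module:

```ocaml
(* A capability is just a method on an object. Functions declare the
   capabilities they need with an open object type < ... ; .. >, so any
   caller holding a superset of capabilities can pass its caps object. *)

let make_caps () : < readdir : string -> string array ; network : unit > =
  object
    method readdir dir = Sys.readdir dir
    method network = ()
  end

(* Needs only readdir; accepts any object with at least that method. *)
let list_entries (caps : < readdir : string -> string array ; .. >) dir =
  caps#readdir dir |> Array.to_list |> List.sort compare

let () =
  let caps = make_caps () in
  list_entries caps "." |> List.iter print_endline
```

This is why the `.mli` changes above only add `Cap.readdir` to the closed `caps` type, while call sites such as `find_targets_for_rule` can keep the open `< Cap.readdir ; .. >` form.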
diff --git src/osemgrep/cli_scan/Diff_scan.ml src/osemgrep/cli_scan/Diff_scan.ml
index 93780c78589f..113e9d713ce4 100644
--- src/osemgrep/cli_scan/Diff_scan.ml
+++ src/osemgrep/cli_scan/Diff_scan.ml
@@ -180,8 +180,9 @@ let scan_baseline_and_remove_duplicates (caps : < Cap.chdir ; Cap.tmp >)
               let baseline_targets, baseline_diff_targets =
                 match conf.engine_type with
                 | PRO Engine_type.{ analysis = Interprocedural; _ } ->
+                    let caps = Cap.readdir_UNSAFE () in
                     let all_in_baseline, _errors, _skipped =
-                      Find_targets.get_target_fpaths conf.targeting_conf
+                      Find_targets.get_target_fpaths caps conf.targeting_conf
                         conf.target_roots
                     in
                     (* Performing a scan on the same set of files for the
diff --git src/osemgrep/cli_scan/Ls_subcommand.ml src/osemgrep/cli_scan/Ls_subcommand.ml
index 6d79b66dba68..8554996c6870 100644
--- src/osemgrep/cli_scan/Ls_subcommand.ml
+++ src/osemgrep/cli_scan/Ls_subcommand.ml
@@ -13,9 +13,10 @@ type format = Paths_only | Long [@@deriving show]
 
 let default_format = Paths_only
 
-let run ~target_roots ~targeting_conf:conf ~format () =
+let run (caps : < Cap.readdir ; .. >) ~target_roots ~targeting_conf:conf ~format
+    =
   let selected, errors, skipped =
-    Find_targets.get_target_fpaths conf target_roots
+    Find_targets.get_target_fpaths caps conf target_roots
   in
   selected |> List.sort Fpath.compare
   |> List.iter (fun (x : Fpath.t) ->
diff --git src/osemgrep/cli_scan/Ls_subcommand.mli src/osemgrep/cli_scan/Ls_subcommand.mli
index f61cfd359209..36b6b88a6573 100644
--- src/osemgrep/cli_scan/Ls_subcommand.mli
+++ src/osemgrep/cli_scan/Ls_subcommand.mli
@@ -20,8 +20,8 @@ val default_format : format
    Print the list of selected targets in alphabetical order, one per line.
 *)
 val run :
+  < Cap.readdir ; .. > ->
   target_roots:Scanning_root.t list ->
   targeting_conf:Find_targets.conf ->
   format:format ->
-  unit ->
   Exit_code.t
diff --git src/osemgrep/cli_scan/Scan_CLI.ml src/osemgrep/cli_scan/Scan_CLI.ml
index 119f66603409..4663ac19812e 100644
--- src/osemgrep/cli_scan/Scan_CLI.ml
+++ src/osemgrep/cli_scan/Scan_CLI.ml
@@ -235,6 +235,16 @@ let o_use_git : bool Term.t =
         in a git repository.
         '--use-git-ignore' is semgrep's default behavior.|}
 
+(*
+   This is a temporary option that has an effect only in pysemgrep during
+   the process of migration from Python's file targeting to the OCaml
+   implementation in semgrep-core. It's only here so that we get
+   it documented in '--help'!
+*)
+let o_use_semgrepignore_v2 : bool Cmdliner.Term.t =
+  H.negatable_flag [ "semgrepignore-v2" ] ~neg_options:[ "no-semgrepignore-v2" ]
+    ~default:false ~doc:"Under development. Not currently recommended."
+
 let o_ignore_semgrepignore_files : bool Term.t =
   let info =
     Arg.info
@@ -981,6 +991,11 @@ CHANGE OR DISAPPEAR WITHOUT NOTICE.
   in
   Arg.value (Arg.flag info)
 
+(* LATER: move in SCA section with allow-local-build *)
+let o_tr : bool Term.t =
+  let info = Arg.info [ "x-tr" ] ~doc:"<internal, do not use>" in
+  Arg.value (Arg.flag info)
+
 (*****************************************************************************)
 (* Helpers *)
 (*****************************************************************************)
@@ -1091,7 +1106,8 @@ let project_root_conf ~project_root ~remote : Find_targets.project_root option =
       Some (Find_targets.Filesystem (Rfpath.of_string_exn root))
   | None, Some url when is_git_repo url ->
       (* CWD must be empty for this to work *)
-      let has_files = not (List_.null (List_files.list (Fpath.v "."))) in
+      let caps = Cap.readdir_UNSAFE () in
+      let has_files = not (List_.null (List_files.list caps (Fpath.v "."))) in
       if has_files then
         Error.abort
           "Cannot use --remote with a git remote when the current directory is \
@@ -1312,11 +1328,12 @@ let cmdline_term caps ~allow_empty_config : conf Term.t =
       sarif_outputs scan_unknown_extensions secrets severity
       show_supported_languages strict target_roots test test_ignore_todo text
       text_outputs time_flag timeout _timeout_interfileTODO timeout_threshold
-      trace trace_endpoint use_git validate version version_check vim
-      vim_outputs x_ignore_semgrepignore_files x_ls x_ls_long =
+      trace trace_endpoint use_git _use_semgrepignore_v2 validate version
+      version_check vim vim_outputs x_ignore_semgrepignore_files x_ls x_ls_long
+      x_tr =
     (* Print a warning if any of the internal or experimental options.
        We don't want users to start relying on these. *)
-    if x_ignore_semgrepignore_files || x_ls || x_ls_long then
+    if x_ignore_semgrepignore_files || x_ls || x_ls_long || x_tr then
       Logs.warn (fun m ->
           m
             "!!! You're using one or more options starting with '--x-'. These \
@@ -1396,6 +1413,10 @@ let cmdline_term caps ~allow_empty_config : conf Term.t =
         strict;
         time_flag;
         matching_explanations;
+        (* only relevant for CI scans, but must be here for when we make
+           the `Core_runner_conf.t`
+        *)
+        symbol_analysis = Core_runner.default_conf.symbol_analysis;
       }
     in
     let include_ =
@@ -1533,9 +1554,9 @@ let cmdline_term caps ~allow_empty_config : conf Term.t =
     $ o_secrets $ o_severity $ o_show_supported_languages $ o_strict
     $ o_target_roots $ o_test $ Test_CLI.o_test_ignore_todo $ o_text
     $ o_text_outputs $ o_time $ o_timeout $ o_timeout_interfile
-    $ o_timeout_threshold $ o_trace $ o_trace_endpoint $ o_use_git $ o_validate
-    $ o_version $ o_version_check $ o_vim $ o_vim_outputs
-    $ o_ignore_semgrepignore_files $ o_ls $ o_ls_long)
+    $ o_timeout_threshold $ o_trace $ o_trace_endpoint $ o_use_git
+    $ o_use_semgrepignore_v2 $ o_validate $ o_version $ o_version_check $ o_vim
+    $ o_vim_outputs $ o_ignore_semgrepignore_files $ o_ls $ o_ls_long $ o_tr)
 
 let doc = "run semgrep rules on files"
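
The `--semgrepignore-v2` / `--no-semgrepignore-v2` pair above goes through semgrep's local `negatable_flag` helper. With plain Cmdliner, the same shape can be sketched using `Arg.vflag` (a sketch against Cmdliner's public API; option names here are illustrative, not semgrep's helper):

```ocaml
(* Sketch: a boolean flag with an explicit negation, in the style of
   --semgrepignore-v2 / --no-semgrepignore-v2. Requires the cmdliner
   library (>= 1.1 for the Cmd module). *)
open Cmdliner

let o_use_v2 : bool Term.t =
  let on = (true, Arg.info [ "semgrepignore-v2" ] ~doc:"Enable v2.") in
  let off = (false, Arg.info [ "no-semgrepignore-v2" ] ~doc:"Disable v2.") in
  Arg.value (Arg.vflag false [ on; off ])

let cmd =
  let run use_v2 = print_endline (string_of_bool use_v2) in
  Cmd.v (Cmd.info "demo") Term.(const run $ o_use_v2)

let () = exit (Cmd.eval cmd)
```

`Arg.vflag false [...]` makes the last flag given on the command line win, which is the usual behavior wanted from a negatable pair.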
 
diff --git src/osemgrep/cli_scan/Scan_CLI.mli src/osemgrep/cli_scan/Scan_CLI.mli
index 977ff58b4352..5f2a8ce5cd51 100644
--- src/osemgrep/cli_scan/Scan_CLI.mli
+++ src/osemgrep/cli_scan/Scan_CLI.mli
@@ -127,9 +127,11 @@ val o_time : bool Cmdliner.Term.t
 val o_timeout : float Cmdliner.Term.t
 val o_timeout_interfile : int Cmdliner.Term.t
 val o_timeout_threshold : int Cmdliner.Term.t
+val o_tr : bool Cmdliner.Term.t
 val o_trace : bool Cmdliner.Term.t
 val o_trace_endpoint : string option Cmdliner.Term.t
 val o_use_git : bool Cmdliner.Term.t
+val o_use_semgrepignore_v2 : bool Cmdliner.Term.t
 val o_version_check : bool Cmdliner.Term.t
 val o_vim : bool Cmdliner.Term.t
 val o_vim_outputs : string list Cmdliner.Term.t
diff --git src/osemgrep/cli_scan/Scan_subcommand.ml src/osemgrep/cli_scan/Scan_subcommand.ml
index 4490f21062ae..18c5fe01bc5f 100644
--- src/osemgrep/cli_scan/Scan_subcommand.ml
+++ src/osemgrep/cli_scan/Scan_subcommand.ml
@@ -41,6 +41,8 @@ type caps =
      * differential scans as we use Git_wrapper.run_with_worktree.
      *)
     Cap.chdir
+  ; (* for Test_subcommand dispatch and Core_scan scanning root targeting *)
+    Cap.readdir
   ; (* for Parmap in Core_scan *)
     Cap.fork
   ; (* for Check_rules timeout *)
@@ -697,20 +699,14 @@ let run_scan_conf (caps : < caps ; .. >) (conf : Scan_CLI.conf) : Exit_code.t =
       (* step2: getting the targets *)
       Logs.info (fun m -> m "Computing the targets");
       let targets_and_skipped =
-        Find_targets.get_target_fpaths conf.targeting_conf conf.target_roots
+        Find_targets.get_target_fpaths caps conf.targeting_conf
+          conf.target_roots
       in
 
-      (* step3: let's go *)
+      (* step3: let's go (no need for network caps from now on) *)
       let res =
-        check_targets_with_rules
-          (caps
-            :> < Cap.stdout
-               ; Cap.chdir
-               ; Cap.tmp
-               ; Cap.fork
-               ; Cap.time_limit
-               ; Cap.memory_limit >)
-          conf profiler rules_and_origins targets_and_skipped
+        check_targets_with_rules caps conf profiler rules_and_origins
+          targets_and_skipped
       in
 
       (* step4: exit with the right exit code *)
@@ -795,14 +791,7 @@ let run_conf (caps : < caps ; .. >) (conf : Scan_CLI.conf) : Exit_code.t =
       (* TOPORT: if enable_version_check: version_check() *)
       Exit_code.ok ~__LOC__
   | _ when conf.test <> None ->
-      Test_subcommand.run_conf
-        (caps
-          :> < Cap.stdout
-             ; Cap.fork
-             ; Cap.time_limit
-             ; Cap.memory_limit
-             ; Cap.tmp >)
-        (Common2.some conf.test)
+      Test_subcommand.run_conf caps (Common2.some conf.test)
   | _ when conf.validate <> None ->
       Validate_subcommand.run_conf
         (caps
@@ -818,8 +807,8 @@ let run_conf (caps : < caps ; .. >) (conf : Scan_CLI.conf) : Exit_code.t =
         (caps :> < Cap.stdout ; Cap.network ; Cap.tmp >)
         (Common2.some conf.show)
   | _ when conf.ls ->
-      Ls_subcommand.run ~target_roots:conf.target_roots
-        ~targeting_conf:conf.targeting_conf ~format:conf.ls_format ()
+      Ls_subcommand.run caps ~target_roots:conf.target_roots
+        ~targeting_conf:conf.targeting_conf ~format:conf.ls_format
   | _ ->
       (* --------------------------------------------------------- *)
       (* Let's go, this is an actual scan subcommand *)
diff --git src/osemgrep/cli_scan/Scan_subcommand.mli src/osemgrep/cli_scan/Scan_subcommand.mli
index c744cd0711c6..f0b51fb5a99b 100644
--- src/osemgrep/cli_scan/Scan_subcommand.mli
+++ src/osemgrep/cli_scan/Scan_subcommand.mli
@@ -14,6 +14,7 @@ type caps =
   ; Cap.network
   ; Cap.tmp
   ; Cap.chdir
+  ; Cap.readdir
   ; Cap.fork
   ; Cap.time_limit
   ; Cap.memory_limit >
diff --git src/osemgrep/cli_show/Show_CLI.ml src/osemgrep/cli_show/Show_CLI.ml
index 1a0c56e53e95..8d5f09ba166e 100644
--- src/osemgrep/cli_show/Show_CLI.ml
+++ src/osemgrep/cli_show/Show_CLI.ml
@@ -125,8 +125,7 @@ let cmdline_term : conf Term.t =
       | [ "debug"; dir; root ] ->
           Debug { output_dir = Some (Fpath.v dir); root = Fpath.v root }
       | [ "debug"; root ] -> Debug { output_dir = None; root = Fpath.v root }
-      | [ "debug" ] ->
-          Debug { output_dir = None; root = Fpath.v @@ Sys.getcwd () }
+      | [ "debug" ] -> Debug { output_dir = None; root = Fpath.v "." }
       | [] ->
           Error.abort
             (spf
diff --git src/osemgrep/cli_show/Test_show_subcommand.ml src/osemgrep/cli_show/Test_show_subcommand.ml
index 2ba191a3c56e..02f7041b8266 100644
--- src/osemgrep/cli_show/Test_show_subcommand.ml
+++ src/osemgrep/cli_show/Test_show_subcommand.ml
@@ -139,13 +139,20 @@ let test_supported_languages (caps : caps) : Testo.t =
       in
       Exit_code.Check.ok exit_code)
 
+(* fragile: due to smart formatting by the Format module and IDs of variable
+   lengths, the output can vary from one test run to another even after
+   masking the variable IDs.
+   TODO: replace all sequences of blanks and newlines by a single newline?
+*)
 let test_dump_config (caps : caps) : Testo.t =
-  t ~checked_output:(Testo.stdout ())
+  t ~checked_output:(Testo.stdout ()) ~tags:[ Test_tags.flaky ]
     ~normalize:
       [
         (* because of the use of Xpattern.count global for pattern id *)
-        Testo.mask_line ~after:"pid = " ~before:" }" ();
-        Testo.mask_line ~after:"id_info_id = " ~before:" }" ();
+        Testo.mask_pcre_pattern "pid = [0-9]+" ~replace:(fun _ ->
+            "pid = <MASKED NUM>");
+        Testo.mask_pcre_pattern "id_info_id = [0-9]+" ~replace:(fun _ ->
+            "id_info_id = <MASKED NUM>");
       ]
     __FUNCTION__
     (fun () ->
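
The switch above from `Testo.mask_line` to `Testo.mask_pcre_pattern` pins the mask to the digits themselves rather than to line layout, which survives reformatting by the `Format` module. Independent of Testo, the normalization step can be sketched with the stdlib `Str` module (requires linking the `str` library; the patterns mirror the ones in the diff):

```ocaml
(* Sketch: normalize volatile numeric IDs in snapshot output so that
   test comparisons are stable across runs. *)
let mask_ids (output : string) : string =
  output
  |> Str.global_replace (Str.regexp "pid = [0-9]+") "pid = <MASKED NUM>"
  |> Str.global_replace
       (Str.regexp "id_info_id = [0-9]+")
       "id_info_id = <MASKED NUM>"

let () = print_string (mask_ids "{ pid = 42 }\n{ id_info_id = 7 }\n")
```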
diff --git src/osemgrep/cli_show/dune src/osemgrep/cli_show/dune
index 40f1fbe3b275..2f1679e8dfd6 100644
--- src/osemgrep/cli_show/dune
+++ src/osemgrep/cli_show/dune
@@ -12,6 +12,7 @@
     lwt
     networking.http_helpers
     lwt_platform
+    testo
 
     semgrep.parsing
     semgrep.parsing.tests ; Test_parsing.dump_tree_sitter_cst
diff --git src/osemgrep/cli_test/Test_subcommand.ml src/osemgrep/cli_test/Test_subcommand.ml
index 2037830a9246..ddae84762fd3 100644
--- src/osemgrep/cli_test/Test_subcommand.ml
+++ src/osemgrep/cli_test/Test_subcommand.ml
@@ -51,14 +51,13 @@ module A = Test_annotation
 (*****************************************************************************)
 (* Types and constants *)
 (*****************************************************************************)
-(* = Cap.stdout + Core_scan.caps + Cap.tmp (for Deep_scan.caps)
+(* = Cap.tmp is for Deep_scan.caps
  * (no need for Cap.network; the tested rules should be local)
  *)
-type caps =
-  < Cap.stdout ; Cap.fork ; Cap.time_limit ; Cap.memory_limit ; Cap.tmp >
+type caps = < Cap.stdout ; Core_scan.caps ; Cap.tmp ; Cap.readdir >
 
 (* Core_scan.caps | Deep_scan.caps *)
-type scan_caps = < Cap.fork ; Cap.time_limit ; Cap.memory_limit ; Cap.tmp >
+type scan_caps = < Core_scan.caps ; Cap.tmp >
 
 (* Rules and targets to test together.
  * Usually the target list contains just one file, but in some cases
@@ -140,11 +139,14 @@ let hook_deep_scan :
 (*****************************************************************************)
 
 (* TODO? Move to Rule_tests.ml? *)
-let find_targets_for_rule (rule_file : Fpath.t) : Fpath.t list =
+let find_targets_for_rule (caps : < Cap.readdir ; .. >) (rule_file : Fpath.t) :
+    Fpath.t list =
   let dir, base = Fpath.split_base rule_file in
   (* ex: "useless-if" (without the ".yaml") *)
   let base_no_ext = Fpath.rem_ext base in
-  dir |> List_files.read_dir_entries_fpath
+  dir
+  |> CapFS.read_dir_entries caps
+  |> List_.map Fpath.v
   |> List_.exclude (fun p ->
          Fpath.equal p base || List.mem "fixed" (Fpath_.exts p))
   |> List_.filter_map (fun p ->
@@ -153,8 +155,8 @@ let find_targets_for_rule (rule_file : Fpath.t) : Fpath.t list =
            Some (dir // p)
          else None)
 
-let rules_and_targets (kind : Test_CLI.target_kind) (errors : error list ref) :
-    tests =
+let rules_and_targets (caps : < Cap.readdir ; .. >)
+    (kind : Test_CLI.target_kind) (errors : error list ref) : tests =
   match kind with
   | Test_CLI.Dir (dir, None) ->
       (* coupling: similar to Test_engine.test_rules() *)
@@ -164,7 +166,7 @@ let rules_and_targets (kind : Test_CLI.target_kind) (errors : error list ref) :
       in
       rule_files
       |> List_.filter_map (fun (rule_file : Fpath.t) ->
-             match find_targets_for_rule rule_file with
+             match find_targets_for_rule caps rule_file with
              | [] ->
                  (* stricter: (but reported via config_missing_tests in JSON)*)
                  Logs.warn (fun m ->
@@ -858,7 +860,7 @@ let run_conf (caps : < caps ; .. >) (conf : Test_CLI.conf) : Exit_code.t =
   (* We now support multiple targets (e.g., .jsx/.tsx) analyzed independently.
    * TODO: multiple targets analyzed together for --pro interfile analysis.
    *)
-  let tests : tests = rules_and_targets conf.target errors in
+  let tests : tests = rules_and_targets caps conf.target errors in
 
   (* step2: run the tests *)
   let result : tests_result = run_tests caps conf tests errors in
diff --git src/osemgrep/cli_test/Test_subcommand.mli src/osemgrep/cli_test/Test_subcommand.mli
index f8e77fa35175..398509b3a4e8 100644
--- src/osemgrep/cli_test/Test_subcommand.mli
+++ src/osemgrep/cli_test/Test_subcommand.mli
@@ -1,6 +1,5 @@
-(* = Cap.stdout + Core_scan.caps | Deep_scan.caps *)
-type caps =
-  < Cap.stdout ; Cap.fork ; Cap.time_limit ; Cap.memory_limit ; Cap.tmp >
+(* tmp is for Deep_scan.caps *)
+type caps = < Cap.stdout ; Core_scan.caps ; Cap.tmp ; Cap.readdir >
 
 (*
    Parse a semgrep-test command, execute it and exit.
diff --git src/osemgrep/core_runner/Core_runner.ml src/osemgrep/core_runner/Core_runner.ml
index 007b5e2bd8fb..a0a120a6af71 100644
--- src/osemgrep/core_runner/Core_runner.ml
+++ src/osemgrep/core_runner/Core_runner.ml
@@ -43,6 +43,8 @@ type conf = {
    * even if it was not requested by the CLI
    *)
   dataflow_traces : bool;
+  (* set by the scan config from the app *)
+  symbol_analysis : bool;
 }
 [@@deriving show]
 
@@ -99,6 +101,7 @@ let default_conf : conf =
     time_flag = false;
     nosem = true;
     strict = false;
+    symbol_analysis = false;
   }
 
 (*****************************************************************************)
@@ -167,6 +170,7 @@ let core_scan_config_of_conf (conf : conf) : Core_scan_config.t =
    time_flag;
    (* TODO *)
    dataflow_traces = _;
+   symbol_analysis;
   } ->
       (* We do our own output in osemgrep, no need for Core_scan.scan() output *)
       let output_format = Core_scan_config.NoOutput in
@@ -193,6 +197,7 @@ let core_scan_config_of_conf (conf : conf) : Core_scan_config.t =
         respect_rule_paths = true;
         max_match_per_file = Core_scan_config.default.max_match_per_file;
         tracing = None;
+        symbol_analysis;
       }
 
 (* output adapter to Core_scan.scan.
diff --git src/osemgrep/core_runner/Core_runner.mli src/osemgrep/core_runner/Core_runner.mli
index cd79f89e6f09..71875eb98dd0 100644
--- src/osemgrep/core_runner/Core_runner.mli
+++ src/osemgrep/core_runner/Core_runner.mli
@@ -19,6 +19,10 @@ type conf = {
   time_flag : bool;
   matching_explanations : bool;
   dataflow_traces : bool;
+  (* extra scan-adjacent information
+     only set by the scan config from the app
+  *)
+  symbol_analysis : bool;
 }
 [@@deriving show]
 
diff --git src/osemgrep/networking/Rule_fetching.ml src/osemgrep/networking/Rule_fetching.ml
index 438aff65f372..57d4af551c46 100644
--- src/osemgrep/networking/Rule_fetching.ml
+++ src/osemgrep/networking/Rule_fetching.ml
@@ -373,7 +373,8 @@ let rules_from_dashdash_config_async ~rewrite_rule_ids ~token_opt caps kind :
        * we used to fetch rules from ~/.semgrep/ implicitely when --config
        * was not given, but this feature was removed, so now we can KISS.
        *)
-      List_files.list dir
+      let caps_dir = Cap.readdir_UNSAFE () in
+      List_files.list caps_dir dir
       |> List.filter Rule_file.is_valid_rule_filename
       |> List_.map (fun file ->
              load_rules_from_file ~rewrite_rule_ids ~origin:(Local_file file)
diff --git src/osemgrep/networking/Semgrep_App.ml src/osemgrep/networking/Semgrep_App.ml
index 0198fad22912..356841bf0ec1 100644
--- src/osemgrep/networking/Semgrep_App.ml
+++ src/osemgrep/networking/Semgrep_App.ml
@@ -82,6 +82,8 @@ let pro_binary_route (platform_kind : pro_engine_arch) =
   in
   "api/agent/deployments/deepbinary/" ^ arch_str
 
+let symbol_analysis_route scan_id = spf "/api/agent/scans/%d/symbols" scan_id
+
 (*****************************************************************************)
 (* Extractors *)
 (*****************************************************************************)
@@ -438,3 +440,57 @@ let upload_rule_to_registry_async caps json =
 
 let upload_rule_to_registry caps json =
   Lwt_platform.run (upload_rule_to_registry_async caps json)
+
+let upload_symbol_analysis_async caps ~token ~scan_id symbol_analysis :
+    (string, string) result Lwt.t =
+  try
+    let url =
+      Uri.with_path !Semgrep_envvars.v.semgrep_url
+        (symbol_analysis_route scan_id)
+    in
+    Logs.debug (fun m ->
+        m "Uploading symbol analysis for %d symbols"
+          (List.length symbol_analysis));
+
+    let headers =
+      [
+        ("Content-Type", "application/json");
+        ("User-Agent", spf "Semgrep/%s" Version.version);
+        Auth.auth_header_of_token token;
+      ]
+    in
+    let body = Out.string_of_symbol_analysis symbol_analysis in
+    match%lwt Http_helpers.post ~body ~headers caps#network url with
+    | Ok { body = Ok body; _ } -> Lwt.return_ok body
+    | Ok { body = Error msg; code; _ } ->
+        let msg =
+          spf
+            "Failed to upload symbol analysis, API server returned %u, this \
+             error: %s"
+            code msg
+        in
+        Lwt.return_error msg
+    | Error e ->
+        let msg = spf "Failed to upload symbol analysis: %s" e in
+        Lwt.return_error msg
+    (* `try ... with exn ->` is horrendous. But hear me out.
+       We are adding this symbol analysis information to arbitrary Semgrep scans, but they are not
+       very related to the actual scan, it's just scan-adjacent information that we want to
+       collect.
+       If there is any error related to symbol analysis, we do _not_ want it to affect the actual
+       scan. Any exception that this raises _cannot_ stop the show.
+       So let's catch it and log unconditionally, but don't crash the program.
+    *)
+  with
+  | exn ->
+      let msg =
+        spf
+          "Got show-stopping exception %s while trying to upload symbol \
+           analysis."
+          (Printexc.to_string exn)
+      in
+      Lwt.return_error msg
+
+let upload_symbol_analysis caps ~token ~scan_id symbol_analysis =
+  Lwt_platform.run
+    (upload_symbol_analysis_async caps ~token ~scan_id symbol_analysis)
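
The blanket `try ... with` around `upload_symbol_analysis_async` exists so that a scan-adjacent upload can never take down the scan itself: every exception is downgraded to an `Error` that the caller merely logs. Reduced to its core (without Lwt), the pattern looks like this:

```ocaml
(* Sketch: run a side task whose failure must never propagate. Any
   exception is converted into an Error string for logging; the main
   workflow continues either way. *)
let shielded (f : unit -> (string, string) result) : (string, string) result =
  try f () with
  | exn -> Error (Printf.sprintf "exception: %s" (Printexc.to_string exn))

let () =
  (match shielded (fun () -> failwith "network down") with
  | Ok msg -> Printf.printf "upload ok: %s\n" msg
  | Error msg -> Printf.printf "upload skipped: %s\n" msg);
  print_endline "scan continues"
```

In the Lwt version above, the `try ... with` only guards the synchronous prefix (URL construction, serialization); failures inside the HTTP call are already surfaced as `Error` by `Http_helpers.post`.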
diff --git src/osemgrep/networking/Semgrep_App.mli src/osemgrep/networking/Semgrep_App.mli
index dfc46e96bf09..8f314a61eb1f 100644
--- src/osemgrep/networking/Semgrep_App.mli
+++ src/osemgrep/networking/Semgrep_App.mli
@@ -84,6 +84,13 @@ val fetch_scan_config_string_async :
     rules (as a RAW string containing JSON data) for the provided
     configuration. *)
 
+val upload_symbol_analysis :
+  < Cap.network ; .. > ->
+  token:Auth.token ->
+  scan_id:int ->
+  Semgrep_output_v1_t.symbol_analysis ->
+  (string, string) result
+
 (*****************************************************************************)
 (* Async variants of functions above *)
 (*****************************************************************************)
@@ -114,3 +121,10 @@ val upload_rule_to_registry_async :
   < Cap.network ; Auth.cap_token ; .. > ->
   JSON.yojson ->
   (string, string) result Lwt.t
+
+val upload_symbol_analysis_async :
+  < Cap.network ; .. > ->
+  token:Auth.token ->
+  scan_id:int ->
+  Semgrep_output_v1_t.symbol_analysis ->
+  (string, string) result Lwt.t
diff --git src/osemgrep/networking/Unit_Login.ml src/osemgrep/networking/Unit_Login.ml
index 99016cc20ccf..668e3ddd0a55 100644
--- src/osemgrep/networking/Unit_Login.ml
+++ src/osemgrep/networking/Unit_Login.ml
@@ -83,20 +83,11 @@ let with_mock_envvars f () =
 let with_mock_envvars_and_normal_responses f =
   with_mock_normal_responses (with_mock_envvars f)
 
-let with_logged_in f =
-  let token = ok_token in
-  let caps = Cap.network_caps_UNSAFE () in
-  let caps = Auth.cap_token_and_network token caps in
-  match Semgrep_login.save_token caps with
-  | Ok _deployment_config -> f ()
-  | Error e -> failwith e
-
 (*****************************************************************************)
 (* Tests *)
 (*****************************************************************************)
 
 let save_token_tests caps =
-  ignore with_logged_in;
   let valid_token_test () =
     let caps = Auth.cap_token_and_network ok_token caps in
     match Semgrep_login.save_token caps with
diff --git src/osemgrep/reporting/Cli_json_output.ml src/osemgrep/reporting/Cli_json_output.ml
index 8ae3867d28a0..d0d922451362 100644
--- src/osemgrep/reporting/Cli_json_output.ml
+++ src/osemgrep/reporting/Cli_json_output.ml
@@ -300,6 +300,7 @@ let cli_match_of_core_match ~fixed_lines fixed_env (hrules : Rule.hrules)
         | None -> `Assoc []
         | Some json -> json
       in
+
       (* LATER: this should be a variant in semgrep_output_v1.atd
        * and merged with Constants.rule_severity
        *)
@@ -457,6 +458,18 @@ let adjust_fields_cli_outpout_logged_out (x : Out.cli_output) : Out.cli_output =
            } : Out.cli_match_extra =
              extra
            in
+           let metadata =
+             match metadata with
+             | `Assoc xs ->
+                 let xs =
+                   xs
+                   |> List_.exclude (fun (fld, _v) ->
+                          List.mem fld [ "semgrep.dev"; "semgrep.policy" ])
+                 in
+                 `Assoc xs
+             | _else_ -> metadata
+           in
+
            let extra =
              Out.
                {
@@ -464,7 +477,6 @@ let adjust_fields_cli_outpout_logged_out (x : Out.cli_output) : Out.cli_output =
                  message;
                  fix;
                  fixed_lines;
-                 (* TODO? metadata filtering? *)
                  metadata;
                  severity;
                  fingerprint = Gated_data.msg;
@@ -518,6 +530,11 @@ let cli_output_of_runner_result ~fixed_lines (core : Out.core_output)
    (* LATER *)
    rules_by_engine = _;
    engine_requested = _;
+   (* We deliberately choose not to embed the symbol analysis into the CLI
+      output, as it is conceivably quite large and irrelevant information
+      for the actual Semgrep scan.
+   *)
+   symbol_analysis = _;
   } ->
       (* TODO: not sure how it's sorted. Look at rule_match.py keys? *)
       let matches =
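
The metadata filtering added above drops the app-specific `"semgrep.dev"` and `"semgrep.policy"` keys from each match's JSON metadata when the user is logged out, leaving non-object metadata untouched. The same filter can be sketched over a Yojson-style assoc representation (self-contained here; variant names only mimic Yojson):

```ocaml
(* Sketch: strip selected keys from a JSON object represented as an
   assoc list; any non-object value passes through unchanged. *)
type json = Assoc of (string * string) list | Other

let filter_fields (blocked : string list) (j : json) : json =
  match j with
  | Assoc xs -> Assoc (List.filter (fun (k, _) -> not (List.mem k blocked)) xs)
  | Other -> Other

let () =
  match
    filter_fields
      [ "semgrep.dev"; "semgrep.policy" ]
      (Assoc [ ("semgrep.dev", "x"); ("cwe", "CWE-79") ])
  with
  | Assoc xs -> List.iter (fun (k, v) -> Printf.printf "%s=%s\n" k v) xs
  | Other -> ()
```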
diff --git src/osemgrep/reporting/Sarif_output.ml src/osemgrep/reporting/Sarif_output.ml
index 6a8c00506b2c..8e21531c5d77 100644
--- src/osemgrep/reporting/Sarif_output.ml
+++ src/osemgrep/reporting/Sarif_output.ml
@@ -377,6 +377,9 @@ let result (ctx : Out.format_context) show_dataflow_traces
     | None -> []
     | Some exposure -> [ ("exposure", `String (Exposure.string_of exposure)) ]
   in
+  (* coupling: if you modify which fields are gated by ctx.is_logged_in update
+   * also https://semgrep.dev/docs/semgrep-appsec-platform/json-and-sarif#sarif
+   *)
   let fingerprints =
     if ctx.is_logged_in then
       [ ("matchBasedId/v1", cli_match.extra.fingerprint) ]
diff --git src/osemgrep/tests/Test_osemgrep.ml src/osemgrep/tests/Test_osemgrep.ml
index d859012f4dc7..bbbd477549f3 100644
--- src/osemgrep/tests/Test_osemgrep.ml
+++ src/osemgrep/tests/Test_osemgrep.ml
@@ -36,7 +36,8 @@ module TL = Test_login_subcommand
 
 (* no need for a token to access public rules in the registry *)
 let test_scan_config_registry_no_token (caps : CLI.caps) =
-  Testo.create __FUNCTION__ (fun () ->
+  Testo.create (* flaky: sometimes fails together with 'LS specs' *)
+    ~tags:[ Test_tags.flaky ] __FUNCTION__ (fun () ->
       Testutil_files.with_tempdir ~chdir:true (fun _tmp_path ->
           let exit_code =
             CLI.main caps
diff --git src/osemgrep/tests/Test_target_selection.ml src/osemgrep/tests/Test_target_selection.ml
index de116fc02dec..e1deea840a57 100644
--- src/osemgrep/tests/Test_target_selection.ml
+++ src/osemgrep/tests/Test_target_selection.ml
@@ -7,13 +7,15 @@ open Printf
 (*
    List targets by invoking Find_targets.get_targets directly.
 *)
-let list_targets_internal ?(conf = Find_targets.default_conf) ?roots () =
+let list_targets_internal ?(conf = Find_targets.default_conf) ?roots caps =
   let roots =
     match roots with
     | None -> [ Scanning_root.of_string "." ]
     | Some roots -> roots
   in
-  let selected, _errors, _skipped = Find_targets.get_target_fpaths conf roots in
+  let selected, _errors, _skipped =
+    Find_targets.get_target_fpaths caps conf roots
+  in
   printf "Target files:\n";
   selected |> List.iter (fun fpath -> printf "  %s\n" (Fpath.to_string fpath))
 
@@ -46,16 +48,16 @@ type repo_with_tests = {
 
 let test_list_from_project_root =
   ( "list target files from project root (internal)",
-    fun _caps -> list_targets_internal () )
+    fun caps -> list_targets_internal caps )
 
 let test_cli_list_from_project_root =
   ("list target files from project root", fun caps -> osemgrep_ls caps)
 
 let test_list_targets_from_subdir ?roots cwd =
-  let func _caps =
+  let func caps =
     Testutil_files.with_chdir cwd (fun () ->
         printf "cwd: %s\n" (Sys.getcwd ());
-        list_targets_internal ?roots ())
+        list_targets_internal ?roots caps)
   in
   let name = "list target files from " ^ Fpath.to_string cwd in
   (name, func)
diff --git src/parsing/tests/Test_parsing.ml src/parsing/tests/Test_parsing.ml
index 83b8ae93b9c5..b2c8eaf02699 100644
--- src/parsing/tests/Test_parsing.ml
+++ src/parsing/tests/Test_parsing.ml
@@ -27,6 +27,10 @@ module Resp = Semgrep_output_v1_t
  * TODO: remove all those ~verbose parameter; just use Logs
  *)
 
+(*****************************************************************************)
+(* Types *)
+(*****************************************************************************)
+
 (*****************************************************************************)
 (* Helpers *)
 (*****************************************************************************)
@@ -247,8 +251,8 @@ let dump_tree_sitter_cst (lang : Lang.t) (file : Fpath.t) : unit =
            Tree_sitter_dockerfile.Boilerplate.dump_extras
   | _ -> failwith "lang not supported by ocaml-tree-sitter"
 
-let test_parse_tree_sitter lang root =
-  let paths = Find_targets_lang.get_target_fpaths root lang in
+let test_parse_tree_sitter (caps : < Cap.readdir ; .. >) lang root =
+  let paths = Find_targets_lang.get_target_fpaths caps root lang in
   let stat_list = ref [] in
   paths
   |> List.iter (fun file ->
@@ -336,7 +340,8 @@ let dump_lang_ast (lang : Lang.t) (file : Fpath.t) : unit =
    This is meant to run the same parsers as semgrep-core does for normal
    semgrep scans.
 *)
-let parsing_common (caps : < Cap.time_limit ; Cap.memory_limit >)
+let parsing_common
+    (caps : < Cap.time_limit ; Cap.memory_limit ; Cap.readdir ; .. >)
     ?(verbose = true) (lang : Lang.t) (root : Fpath.t) =
   let timeout_seconds = 10.0 in
   (* Without the use of Memory_limit below, we were getting some
@@ -375,7 +380,7 @@ let parsing_common (caps : < Cap.time_limit ; Cap.memory_limit >)
   Logs.info (fun m -> m "running with a timeout of %f.1s" timeout_seconds);
   Logs.info (fun m -> m "running with a memory limit of %d MiB" mem_limit_mb);
 
-  let paths = Find_targets_lang.get_target_fpaths root lang in
+  let paths = Find_targets_lang.get_target_fpaths caps root lang in
   (* TODO? remove the skipped returned? *)
   let skipped = [] in
   let stats =
@@ -440,8 +445,9 @@ let parsing_common (caps : < Cap.time_limit ; Cap.memory_limit >)
    be nice to find out about timeouts. I think the timeout threshold should
    in seconds/MB or equivalent units, not seconds per file."
 *)
-let parse_project (caps : < Cap.time_limit ; Cap.memory_limit >) ~verbose lang
-    name root =
+let parse_project
+    (caps : < Cap.time_limit ; Cap.memory_limit ; Cap.readdir ; .. >) ~verbose
+    lang name root =
   let stat_list, _skipped = parsing_common caps ~verbose lang root in
   let stat_list =
     List.filter (fun stat -> not stat.PS.have_timeout) stat_list
@@ -589,8 +595,8 @@ let diff_pfff_tree_sitter xs =
 (* Rule parsing *)
 (*****************************************************************************)
 
-let test_parse_rules root =
-  let targets = Find_targets_lang.get_target_fpaths root Lang.Yaml in
+let test_parse_rules (caps : < Cap.readdir ; .. >) root =
+  let targets = Find_targets_lang.get_target_fpaths caps root Lang.Yaml in
   targets
   |> List.iter (fun file ->
          Logs.info (fun m -> m "processing %s" !!file);
diff --git src/parsing/tests/Test_parsing.mli src/parsing/tests/Test_parsing.mli
index fccc6fce2267..47d69189b9f0 100644
--- src/parsing/tests/Test_parsing.mli
+++ src/parsing/tests/Test_parsing.mli
@@ -10,7 +10,7 @@
  *   {"total":111,"bad":0,"percent_correct":100.0}
  *)
 val parsing_stats :
-  < Cap.time_limit ; Cap.memory_limit > ->
+  < Cap.time_limit ; Cap.memory_limit ; Cap.readdir ; .. > ->
   ?json:bool ->
   ?verbose:bool ->
   Lang.t ->
@@ -21,7 +21,8 @@ val parsing_stats :
  * and stop the parsing at the tree-sitter CST level (it does not
  * try to convert this CST in the generic AST).
  *)
-val test_parse_tree_sitter : Lang.t -> Fpath.t (* root *) -> unit
+val test_parse_tree_sitter :
+  < Cap.readdir ; .. > -> Lang.t -> Fpath.t (* root *) -> unit
 
 (* Dump the tree-sitter CST of the given file (it automatically detects
  * the language and parser to use based on the filename extension). *)
@@ -40,8 +41,8 @@ val dump_lang_ast : Lang.t -> Fpath.t -> unit
  *)
 val diff_pfff_tree_sitter : Fpath.t list -> unit
 
-(* [test_parse_rules root] recursively explores [root] to
+(* [test_parse_rules caps root] recursively explores [root] to
  * find YAML files containing rules and check if they
  * parse correctly using Parse_rule.parse.
  *)
-val test_parse_rules : Fpath.t (* root *) -> unit
+val test_parse_rules : < Cap.readdir ; .. > -> Fpath.t (* root *) -> unit
diff --git src/reporting/Core_json_output.ml src/reporting/Core_json_output.ml
index 286e6310d5b6..87b048508169 100644
--- src/reporting/Core_json_output.ml
+++ src/reporting/Core_json_output.ml
@@ -416,11 +416,13 @@ let sca_to_sca (m : SCA_match.t) : Out.sca_match =
   in
   Out.
     {
-      reachable;
       reachability_rule;
       (* coupling: dependency_aware_rule.py:SCA_FINDING_SCHEMA *)
       sca_finding_schema = 20220913;
       dependency_match;
+      reachable;
+      (* TODO: use m.kind at some point *)
+      kind = None;
     }
 
 (* "unsafe" because can raise NoTokenLocation which is captured in
@@ -680,6 +682,7 @@ let core_output_of_matches_and_errors (res : Core_result.t) : Out.core_output =
         (List_.map (fun l -> Analyzer.to_string l) res.interfile_languages_used);
     engine_requested = Some `OSS;
     version = Version.version;
+    symbol_analysis = res.symbol_analysis;
   }
 [@@profiling]
 
diff --git src/rpc/RPC.ml src/rpc/RPC.ml
index 6809d1d84ab7..899b83821ddc 100644
--- src/rpc/RPC.ml
+++ src/rpc/RPC.ml
@@ -13,7 +13,7 @@ module Out = Semgrep_output_v1_j
 (* Dispatcher *)
 (*****************************************************************************)
 
-let handle_call (caps : < Cap.exec ; Cap.tmp >) :
+let handle_call (caps : < Cap.exec ; Cap.tmp ; Cap.network >) :
     Out.function_call -> (Out.function_return, string) result = function
   | `CallApplyFixes { dryrun; edits } ->
       let modified_file_count, fixed_lines = RPC_return.autofix dryrun edits in
@@ -48,6 +48,18 @@ let handle_call (caps : < Cap.exec ; Cap.tmp >) :
           Error
             "Dependency resolution is a proprietary feature, but semgrep-pro \
              has not been loaded")
+  | `CallUploadSymbolAnalysis (token, scan_id, symbol_analysis) -> (
+      (* Caps are kind of a crap shoot when working across programming language
+         boundaries anyway.
+      *)
+      let token = Auth.unsafe_token_of_string token in
+      match
+        Semgrep_App.upload_symbol_analysis
+          (caps :> < Cap.network >)
+          ~token ~scan_id symbol_analysis
+      with
+      | Error msg -> Error msg
+      | Ok msg -> Ok (`RetUploadSymbolAnalysis msg))
   | `CallDumpRulePartitions params -> (
       match !RPC_return.hook_dump_rule_partitions with
       | Some dump_rule_partitions ->
@@ -58,6 +70,16 @@ let handle_call (caps : < Cap.exec ; Cap.tmp >) :
           Error
            "Dump rule partitions is a proprietary feature, but semgrep-pro \
              has not been loaded")
+  | `CallTransitiveReachabilityFilter xs -> (
+      match !RPC_return.hook_transitive_reachability_filter with
+      | Some transitive_reachability_filter ->
+          let xs = transitive_reachability_filter xs in
+          Ok (`RetTransitiveReachabilityFilter xs)
+      | None ->
+          Error
+            "Transitive reachability is a proprietary feature, but semgrep-pro \
+             has not been loaded")
+  | `CallGetTargets _scanning_roots -> Error "Not yet implemented"
 
 (*****************************************************************************)
 (* Helpers *)
@@ -89,7 +111,7 @@ let write_packet chan str =
   flush chan
 
 (* Blocks until a request comes in, then handles it and sends the result back *)
-let handle_single_request (caps : < Cap.exec ; Cap.tmp >) =
+let handle_single_request (caps : < Cap.exec ; Cap.tmp ; Cap.network >) =
   let res =
     let/ call_str = read_packet stdin in
     let/ call =
@@ -120,10 +142,12 @@ let handle_single_request (caps : < Cap.exec ; Cap.tmp >) =
 (* Entry point *)
 (*****************************************************************************)
 
-let main (caps : < Cap.exec ; Cap.tmp >) =
+let main (caps : < Cap.exec ; Cap.tmp ; Cap.network >) =
   (* For some requests, such as SARIF formatting, we need to parse rules
    * so we need to init the parsers as well. *)
   Parsing_init.init ();
 
+  Http_helpers.set_client_ref (module Cohttp_lwt_unix.Client);
+
   (* For now, just handle one request and then exit. *)
   handle_single_request caps
diff --git src/rpc/RPC.mli src/rpc/RPC.mli
index b9dece69bdac..8b13e0be03fc 100644
--- src/rpc/RPC.mli
+++ src/rpc/RPC.mli
@@ -1,2 +1,5 @@
 (* Runs an RPC server that takes calls on stdin and sends results to stdout. *)
-val main : < Cap.exec ; Cap.tmp > -> unit
+(* - Cap.exec is needed to query Git for project contributions
+   - Cap.network is needed to POST symbol analysis back to the App
+*)
+val main : < Cap.exec ; Cap.tmp ; Cap.network > -> unit
diff --git src/rpc/RPC_return.ml src/rpc/RPC_return.ml
index dc31a8dac37f..d67d69f979c3 100644
--- src/rpc/RPC_return.ml
+++ src/rpc/RPC_return.ml
@@ -88,3 +88,4 @@ let validate (path : Out.fpath) : bool =
 (*****************************************************************************)
 let hook_resolve_dependencies = ref None
 let hook_dump_rule_partitions = ref None
+let hook_transitive_reachability_filter = ref None
diff --git src/rpc/RPC_return.mli src/rpc/RPC_return.mli
index ae66a5c21793..f6c716b94920 100644
--- src/rpc/RPC_return.mli
+++ src/rpc/RPC_return.mli
@@ -15,6 +15,7 @@ val sarif_format :
 val contributions : < Cap.exec > -> Out.contributions
 val validate : Out.fpath -> bool
 
+(* TODO: switch all those option ref to Hook.t *)
 val hook_resolve_dependencies :
   (< Cap.exec ; Cap.tmp > ->
   Out.dependency_source list ->
@@ -22,5 +23,8 @@ val hook_resolve_dependencies :
   option
   ref
 
+val hook_transitive_reachability_filter :
+  (Out.transitive_finding list -> Out.transitive_finding list) option ref
+
 val hook_dump_rule_partitions :
   (Out.raw_json -> int -> Fpath.t -> bool) option ref
diff --git src/rpc/dune src/rpc/dune
index 9749e22284b9..327893fba949 100644
--- src/rpc/dune
+++ src/rpc/dune
@@ -13,6 +13,8 @@
     ; we now also depends on spacegrep with full-rule-in-ocaml
     spacegrep
 
+    cohttp-lwt-unix
+
     ; internal deps in src/
     semgrep.core
     semgrep.core_scan
@@ -22,5 +24,7 @@
     semgrep.analyzing.tests ; Test_analyze_generic.actions
     semgrep.data
     semgrep.osemgrep_reporting
+    semgrep.osemgrep_networking
+    semgrep.osemgrep_core
  )
 )
diff --git src/targeting/Find_targets.ml src/targeting/Find_targets.ml
index 1d2fcf4f6e17..572f15cda98e 100644
--- src/targeting/Find_targets.ml
+++ src/targeting/Find_targets.ml
@@ -401,7 +401,7 @@ let filter_size_and_minified max_target_bytes exclude_minified_files paths =
  *
  * pre: the scan_root must be a path to a directory
  *)
-let walk_skip_and_collect (ign : Gitignore.filter)
+let walk_skip_and_collect (caps : < Cap.readdir ; .. >) (ign : Gitignore.filter)
     (include_filter : Include_filter.t option) (scan_root : Fppath.t) :
     Fppath.t list * Out.skipped_target list =
   Log.info (fun m ->
@@ -426,7 +426,7 @@ let walk_skip_and_collect (ign : Gitignore.filter)
             m "listing dir %s (ppath = %s)" !!(dir.fpath)
               (Ppath.to_string_for_tests dir.ppath));
         (* TODO? should we sort them first? *)
-        let entries = List_files.read_dir_entries dir.fpath in
+        let entries = CapFS.read_dir_entries caps dir.fpath in
         (* TODO: factorize code with filter_paths? *)
         entries
         |> List.iter (fun name ->
@@ -687,7 +687,8 @@ let filter_targets conf project_roots (all_files : Fppath.t list) =
   let ign = setup_path_filters conf project_roots in
   filter_paths ign all_files
 
-let get_targets_from_filesystem conf (project_roots : Project.scanning_roots) =
+let get_targets_from_filesystem (caps : < Cap.readdir ; .. >) (conf : conf)
+    (project_roots : Project.scanning_roots) =
   let ign, include_filter = setup_path_filters conf project_roots in
   List.fold_left
     (fun (selected, skipped) (scan_root : Project.scanning_root_info) ->
@@ -705,7 +706,7 @@ let get_targets_from_filesystem conf (project_roots : Project.scanning_roots) =
         match (Unix.stat !!phys_path).st_kind with
         (* TOPORT? make sure has right permissions (readable) *)
         | S_REG -> ([ fppath ], [])
-        | S_DIR -> walk_skip_and_collect ign include_filter fppath
+        | S_DIR -> walk_skip_and_collect caps ign include_filter fppath
         | S_LNK ->
             (* already dereferenced by Unix.stat *)
             raise Impossible
@@ -771,7 +772,8 @@ let force_select_scanning_roots (project_roots : Project.scanning_roots)
       Typically, the sets of files produced by (2) and (3) overlap vastly.
    4. Take the union of (2) and (3).
 *)
-let get_targets_for_project conf (project_roots : Project.scanning_roots) =
+let get_targets_for_project (caps : < Cap.readdir ; .. >) (conf : conf)
+    (project_roots : Project.scanning_roots) =
   Log.debug (fun m -> m "Find_target.get_targets_for_project");
   (* Obtain the list of files from git if possible because it does it
      faster than what we can do by scanning the filesystem: *)
@@ -793,7 +795,7 @@ let get_targets_for_project conf (project_roots : Project.scanning_roots) =
     (* Non-Git projects *)
     | None, _
     | _, None ->
-        get_targets_from_filesystem conf project_roots
+        get_targets_from_filesystem caps conf project_roots
   in
   let selected_targets, skipped_targets =
     force_select_scanning_roots project_roots selected_targets skipped_targets
@@ -825,7 +827,8 @@ let clone_if_remote_project_root conf =
 (* Entry point *)
 (*************************************************************************)
 
-let get_targets conf scanning_roots :
+let get_targets (caps : < Cap.readdir ; .. >) (conf : conf)
+    (scanning_roots : Scanning_root.t list) :
     Fppath.t list * Core_error.t list * Out.skipped_target list =
   clone_if_remote_project_root conf;
   (* Skipped scanning roots are more serious errors than ordinary skipped
@@ -835,7 +838,7 @@ let get_targets conf scanning_roots :
     scanning_roots |> group_scanning_roots_by_project conf
   in
   grouped_scanning_roots
-  |> List_.map (get_targets_for_project conf)
+  |> List_.map (get_targets_for_project caps conf)
   |> List_.split
   |> fun (path_set_list, skipped_paths_list) ->
   let paths, skipped_size_minified =
@@ -857,6 +860,6 @@ let get_targets conf scanning_roots :
   (paths, errors, sorted_skipped_targets)
 [@@profiling]
 
-let get_target_fpaths conf scanning_roots =
-  let selected, errors, skipped = get_targets conf scanning_roots in
+let get_target_fpaths (caps : < Cap.readdir ; .. >) conf scanning_roots =
+  let selected, errors, skipped = get_targets caps conf scanning_roots in
   (List_.map (fun { Fppath.fpath; _ } -> fpath) selected, errors, skipped)
diff --git src/targeting/Find_targets.mli src/targeting/Find_targets.mli
index ab7f5fdee00e..6197ebfc1d4f 100644
--- src/targeting/Find_targets.mli
+++ src/targeting/Find_targets.mli
@@ -97,6 +97,7 @@ val default_conf : conf
    This may raise Unix.Unix_error if the scanning root does not exist.
 *)
 val get_targets :
+  < Cap.readdir ; .. > ->
   conf ->
   Scanning_root.t list ->
   Fppath.t list * Core_error.t list * Semgrep_output_v1_t.skipped_target list
@@ -110,6 +111,7 @@ val get_targets :
  * files in tests/ directories.
  *)
 val get_target_fpaths :
+  < Cap.readdir ; .. > ->
   conf ->
   Scanning_root.t list ->
   Fpath.t list * Core_error.t list * Semgrep_output_v1_t.skipped_target list
diff --git src/targeting/Find_targets_lang.ml src/targeting/Find_targets_lang.ml
index cb6ae69a821c..c68eae08702c 100644
--- src/targeting/Find_targets_lang.ml
+++ src/targeting/Find_targets_lang.ml
@@ -2,7 +2,8 @@
  * of targets for a certain language. For real semgrep targeting, use the
  * Core_targeting module.
  *)
-let get_target_fpaths (root : Fpath.t) (lang : Lang.t) : Fpath.t list =
+let get_target_fpaths (caps : < Cap.readdir ; .. >) (root : Fpath.t)
+    (lang : Lang.t) : Fpath.t list =
   let conf =
     {
       Find_targets.default_conf with
@@ -15,7 +16,7 @@ let get_target_fpaths (root : Fpath.t) (lang : Lang.t) : Fpath.t list =
    * TODO? at least Logs the errors and skipped?
    *)
   let files, _errors, _skipped =
-    Find_targets.get_target_fpaths conf [ Scanning_root.of_fpath root ]
+    Find_targets.get_target_fpaths caps conf [ Scanning_root.of_fpath root ]
   in
   (* filter out files that are not relevant to the language here, because
    * `Find_targets` fetches _all_ the possibly relevant files it can.
diff --git src/targeting/Find_targets_lang.mli src/targeting/Find_targets_lang.mli
index 147f164a6b15..8983c6298745 100644
--- src/targeting/Find_targets_lang.mli
+++ src/targeting/Find_targets_lang.mli
@@ -8,4 +8,5 @@
  * and the targets will not be filtered by the toplevel .gitignore
  * or .semgrepignore of the repo containing those tests).
  *)
-val get_target_fpaths : Fpath.t (* root *) -> Lang.t -> Fpath.t list
+val get_target_fpaths :
+  < Cap.readdir ; .. > -> Fpath.t (* root *) -> Lang.t -> Fpath.t list
diff --git src/targeting/Unit_find_targets.ml src/targeting/Unit_find_targets.ml
index 897d93826e00..fdceccc52997 100644
--- src/targeting/Unit_find_targets.ml
+++ src/targeting/Unit_find_targets.ml
@@ -31,8 +31,8 @@ module Out = Semgrep_output_v1_t
                   (only relevant if with_git is true)
 *)
 let test_find_targets ?expected_outcome ?includes ?(excludes = [])
-    ?(non_git_files : F.t list = []) ~with_git ?(scanning_root = ".") name
-    (files : F.t list) =
+    ?(non_git_files : F.t list = []) ~with_git ?(scanning_root = ".")
+    (caps : < Cap.readdir ; .. >) name (files : F.t list) =
   let category = if with_git then "with git" else "without git" in
   let test_func () =
     printf "Test name: %s > %s\n" category name;
@@ -63,7 +63,7 @@ let test_find_targets ?expected_outcome ?includes ?(excludes = [])
           }
         in
         let targets, errors, skipped_targets =
-          Find_targets.get_target_fpaths conf
+          Find_targets.get_target_fpaths caps conf
             [ Scanning_root.of_fpath (Fpath.v scanning_root) ]
         in
         (match includes with
@@ -97,34 +97,35 @@ let test_find_targets ?expected_outcome ?includes ?(excludes = [])
         Testo.mask_line ~after:"(root-commit) " ~before:"]" ();
       ]
 
-let tests_with_or_without_git ~with_git =
+let tests_with_or_without_git caps ~with_git =
   [
-    test_find_targets ~with_git "basic test" [ F.File (".gitignore", "") ];
+    test_find_targets caps ~with_git "basic test" [ F.File (".gitignore", "") ];
     (* Select file 'a', not 'b' *)
-    test_find_targets ~with_git "basic gitignore"
+    test_find_targets caps ~with_git "basic gitignore"
       [ F.File (".gitignore", "b\n"); F.file "a"; F.file "b" ];
     (* Select file 'a', not 'b' *)
-    test_find_targets ~with_git "basic semgrepignore"
+    test_find_targets caps ~with_git "basic semgrepignore"
       [ F.File (".semgrepignore", "b\n"); F.file "a"; F.file "b" ];
     (* Select file 'a', not 'b' *)
-    test_find_targets ~with_git ~excludes:[ "b" ] "basic exclude"
+    test_find_targets caps ~with_git ~excludes:[ "b" ] "basic exclude"
       [ F.file "a"; F.file "b" ];
     (* Select file 'a', not 'b' *)
-    test_find_targets ~with_git ~includes:[ "a" ] "basic include"
+    test_find_targets caps ~with_git ~includes:[ "a" ] "basic include"
       [ F.file "a"; F.file "b" ];
     (* Select file 'a', not 'b' *)
-    test_find_targets ~with_git ~includes:[ "a" ] "deep include"
+    test_find_targets caps ~with_git ~includes:[ "a" ] "deep include"
       [ F.dir "dir" [ F.file "a"; F.file "b" ] ];
-    test_find_targets ~with_git ~scanning_root:"a.py" "scanning root as a file"
+    test_find_targets caps ~with_git ~scanning_root:"a.py"
+      "scanning root as a file"
       [ F.file "a.py" ];
     (* Select the symlink and not the regular file it's pointing to. *)
-    test_find_targets ~with_git ~scanning_root:"a.py"
+    test_find_targets caps ~with_git ~scanning_root:"a.py"
       "scanning root as a symlink to a regular file"
       [ F.Symlink ("a.py", "b.py"); F.File ("b.py", "some content") ];
-    test_find_targets ~with_git ~scanning_root:"a.py"
+    test_find_targets caps ~with_git ~scanning_root:"a.py"
       "scanning root as a symlink to a missing regular file"
       [ F.Symlink ("a.py", "b.py") ];
-    test_find_targets ~with_git ~scanning_root:"link-to-src"
+    test_find_targets caps ~with_git ~scanning_root:"link-to-src"
       "scanning root as a symlink to a folder"
       [ F.dir "src" [ F.file "a.py" ]; F.Symlink ("link-to-src", "src") ];
     (*
@@ -132,7 +133,7 @@ let tests_with_or_without_git ~with_git =
        filters.
     *)
     (* Can't select file 'a' via --include when semgrepignoring its folder. *)
-    test_find_targets ~with_git ~includes:[ "*.c" ]
+    test_find_targets caps ~with_git ~includes:[ "*.c" ]
       "semgrepignore file takes precedence over --include"
       [
         F.File (".semgrepignore", "dir\n");
@@ -142,12 +143,13 @@ let tests_with_or_without_git ~with_git =
     (* An explicit target is a scanning root that's also a target file
        and should not be ignored by the usual exclusion mechanisms
        (.semgrepignore, --include, --exclude) *)
-    test_find_targets ~with_git ~scanning_root:"a.py" "scan explicit target"
+    test_find_targets caps ~with_git ~scanning_root:"a.py"
+      "scan explicit target"
       [ F.file "a.py"; F.File (".semgrepignore", "a.py\n") ];
     (* Unspecified behavior: what to do with a scanning root that's
        a symlink to a file that's semgrepignored? Should it be considered
        an explicit target? This test assumes so. We could change it. *)
-    test_find_targets ~with_git ~scanning_root:"symlink.py"
+    test_find_targets caps ~with_git ~scanning_root:"symlink.py"
       "scan symlink to semgrepignored target"
       [
         F.symlink "symlink.py" "semgrepignored.py";
@@ -161,27 +163,27 @@ let tests_with_or_without_git ~with_git =
    for the special kind of projects 'Gitignore_project' which is used
    only in some tests.
 *)
-let tests_with_git_only =
+let tests_with_git_only caps =
   let with_git = true in
   [
     (* Select 'a' and 'c', not 'b'. *)
-    test_find_targets ~with_git "gitignore file is always consulted"
+    test_find_targets caps ~with_git "gitignore file is always consulted"
       ~non_git_files:[ F.file "a"; F.file "b" ]
       [ F.File (".gitignore", "b\n"); F.file "c" ];
     (* Can't select file 'a' via --include when gitignoring its folder. *)
-    test_find_targets ~with_git ~includes:[ "a" ]
+    test_find_targets caps ~with_git ~includes:[ "a" ]
       "gitignore file takes precedence over --include"
       [
         F.File (".gitignore", "dir\n");
         F.dir "dir" [ F.file "a"; F.file "b" ];
         F.file "c";
       ];
-    test_find_targets ~with_git "symlinks from git are filtered too"
+    test_find_targets caps ~with_git "symlinks from git are filtered too"
       [ F.Symlink ("lnk", "missing"); F.File ("a", "some content") ];
   ]
 
-let tests =
+let tests (caps : < Cap.readdir ; .. >) =
   Testo.categorize "Find_targets"
-    (tests_with_or_without_git ~with_git:true
-    @ tests_with_git_only
-    @ tests_with_or_without_git ~with_git:false)
+    (tests_with_or_without_git caps ~with_git:true
+    @ tests_with_git_only caps
+    @ tests_with_or_without_git caps ~with_git:false)
diff --git src/targeting/Unit_find_targets.mli src/targeting/Unit_find_targets.mli
index 3aab39abf737..b4b66a2ba7af 100644
--- src/targeting/Unit_find_targets.mli
+++ src/targeting/Unit_find_targets.mli
@@ -2,4 +2,4 @@
    Tests for the Find_targets module.
 *)
 
-val tests : Testo.t list
+val tests : < Cap.readdir ; .. > -> Testo.t list
diff --git src/targeting/Unit_guess_lang.ml src/targeting/Unit_guess_lang.ml
index 79fd35f676e6..2b929fc01f38 100644
--- src/targeting/Unit_guess_lang.ml
+++ src/targeting/Unit_guess_lang.ml
@@ -65,42 +65,25 @@ let contents_tests : (string * Lang.t * string * string * exec * success) list =
     ("php", Php, "foo.php", "", Nonexec, OK);
   ]
 
-let mkdir path = if not (Sys.file_exists path) then Unix.mkdir path 0o777
-
-(*
-   Create a temporary file with the specified name, in a local tmp folder.
-   We don't delete the files when we're done because it's easier when
-   troubleshooting tests.
-*)
-let with_file name contents exec f =
-  let dir = Fpath.v "tmp" in
-  mkdir !!dir;
-  let path = dir / name in
-  let oc = open_out_bin !!path in
-  (match exec with
-  | Exec -> Unix.chmod !!path 0o755
-  | Nonexec -> ());
-  Common.protect
-    ~finally:(fun () -> close_out oc)
-    (fun () ->
-      output_string oc contents;
-      close_out oc;
-      f path)
-
 let test_name_only lang path expectation =
   match (expectation, Guess_lang.inspect_file lang path) with
   | OK, Ok _
   | XFAIL, Error _ ->
       ()
-  | _ -> assert false
+  | OK, Error _ -> assert false
+  | XFAIL, Ok _ -> assert false
 
 let test_with_contents lang name contents exec expectation =
-  with_file name contents exec (fun path ->
+  Testo.with_temp_file ~suffix:name ~contents (fun path ->
+      (match exec with
+      | Exec -> Unix.chmod !!path 0o755
+      | Nonexec -> ());
       match (expectation, Guess_lang.inspect_file lang path) with
       | OK, Ok _
       | XFAIL, Error _ ->
           ()
-      | _ -> assert false)
+      | OK, Error _ -> assert false
+      | XFAIL, Ok _ -> assert false)
 
 let test_inspect_file =
   List_.map
diff --git src/tests/Test.ml src/tests/Test.ml
index 2a7c27f1bf85..add2d937c322 100644
--- src/tests/Test.ml
+++ src/tests/Test.ml
@@ -112,9 +112,9 @@ let tests (caps : Cap.all_caps) =
   List_.flatten
     [
       Commons_tests.tests;
-      Unit_list_files.tests;
+      Unit_list_files.tests (caps :> < Cap.readdir >);
       Glob.Unit_glob.tests;
-      Unit_find_targets.tests;
+      Unit_find_targets.tests (caps :> < Cap.readdir >);
       Unit_semgrepignore.tests;
       Unit_gitignore.tests;
       Unit_include_filter.tests;
@@ -141,9 +141,9 @@ let tests (caps : Cap.all_caps) =
       Unit_matcher.tests ~any_gen_of_string;
       (* TODO Unit_matcher.spatch_unittest ~xxx *)
       (* TODO Unit_matcher_php.unittest; sgrep/spatch/refactoring/unparsing *)
-      Unit_engine.tests ();
+      Unit_engine.tests (caps :> < Cap.readdir >);
       Unit_jsonnet.tests (caps :> < Cap.time_limit >);
-      Unit_metachecking.tests (caps :> Core_scan.caps);
+      Unit_metachecking.tests (caps :> < Core_scan.caps ; Cap.readdir >);
       (* osemgrep unit tests *)
       Unit_LS.tests (caps :> Session.caps);
       Unit_Login.tests caps;
diff --git tests/snapshots/semgrep-core/4f06f5cf1f9b/stdout tests/snapshots/semgrep-core/4f06f5cf1f9b/stdout
index dbc5fe1630c3..64af3b5b2424 100644
--- tests/snapshots/semgrep-core/4f06f5cf1f9b/stdout
+++ tests/snapshots/semgrep-core/4f06f5cf1f9b/stdout
@@ -28,7 +28,7 @@ rule.yml
                                                      id_type = ref (None);
                                                      id_svalue = ref (None);
                                                      id_flags = ref (0);
-                                                     id_info_id = <MASKED> }
+                                                     id_info_id = <MASKED NUM> }
                                                    )));
                                              e_id = 0; e_range = None;
                                              is_implicit_return = false;
@@ -44,7 +44,7 @@ rule.yml
                                                       id_type = ref (None);
                                                       id_svalue = ref (None);
                                                       id_flags = ref (0);
-                                                      id_info_id = <MASKED> }
+                                                      id_info_id = <MASKED NUM> }
                                                     )));
                                               e_id = 0; e_range = None;
                                               is_implicit_return = false;
@@ -56,7 +56,7 @@ rule.yml
                                    is_implicit_return = false;
                                    facts = <opaque> }),
                               Python));
-                           pstr = ("$X == $X", ()); pid = <MASKED> });
+                           pstr = ("$X == $X", ()); pid = <MASKED NUM> });
                       conditions = []; focus = []; fix = None; as_ = None }
                      ]
                    ));

Description

This PR includes several major changes to the Semgrep codebase:

  1. Upgrades Python version from 3.11 to 3.12
  2. Adds capabilities for handling Windows builds
  3. Adds proper file targeting and symbol analysis capabilities
  4. Improves error handling for Git worktree operations
  5. Adds more robust capability checking throughout the codebase

The motivation appears to be improving Semgrep's multi-platform support while also adding new features for dependency and symbol analysis.

Possible Issues

  • The Docker image's Python bump to 3.12 could affect users who haven't upgraded their local Python installations
  • Windows build changes may need extensive testing
  • There is a complex dependency resolution path relying on pkg-config that needs to be tested

Security Hotspots

  1. File System Access:
let readdir _caps = Unix.readdir

The new capability-based file system access needs careful auditing to ensure proper permissions are enforced.
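The `< Cap.readdir ; .. >` annotations threaded through the diff are structural OCaml object types: any object providing at least the listed method can serve as the capability, and a function without the parameter simply cannot touch the filesystem. A minimal standalone sketch of the pattern, using illustrative names rather than Semgrep's actual `Cap`/`CapFS` API:

```ocaml
(* Illustrative capability-passing sketch; [readdir] here stands in for
   Semgrep's Cap.readdir, not its real definition. *)

(* Any function that needs to list directories must be handed a
   capability object; the [..] row accepts objects carrying extra
   capabilities too, mirroring < Cap.readdir ; .. > in the diff. *)
let list_entries (caps : < readdir : string -> string array ; .. >)
    (dir : string) : string list =
  Array.to_list (caps#readdir dir)

(* The entry point is the only place the real capability is minted;
   everything below it must receive [caps] as an explicit argument. *)
let all_caps =
  object
    method readdir = Sys.readdir
    method network = () (* other capabilities would live alongside *)
  end

let () =
  let entries = list_entries all_caps "." in
  assert (List.length entries >= 0)
```

The payoff for auditing is that grepping for where the capability object is constructed or coerced (`:>`) shows every code path that can list directories.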

  2. Symbol Analysis Upload:
val upload_symbol_analysis :
  token:Auth.token ->
  scan_id:int -> 
  symbol_analysis -> unit

The symbol analysis upload functionality needs verification that sensitive symbol information isn't leaked.
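Relatedly, the RPC dispatcher gates proprietary features (`CallDumpRulePartitions`, `CallTransitiveReachabilityFilter`) behind option-ref hooks that semgrep-pro fills in at load time, as in `RPC_return.ml`'s `let hook_transitive_reachability_filter = ref None`. A standalone sketch of that hook pattern, with the types simplified to `int list` for illustration:

```ocaml
(* The OSS dispatcher only holds a slot; the pro module assigns it. *)
let hook_filter : (int list -> int list) option ref = ref None

(* Dispatch fails gracefully when the proprietary hook is absent,
   mirroring the Error branches in RPC.ml's handle_call. *)
let handle_call (xs : int list) : (int list, string) result =
  match !hook_filter with
  | Some filter -> Ok (filter xs)
  | None -> Error "feature not loaded"

let () =
  (* Before registration, the call reports the feature as missing. *)
  assert (handle_call [ 1; 2; 3 ] = Error "feature not loaded");
  (* The pro module (simulated here) installs its implementation. *)
  hook_filter := Some (List.filter (fun x -> x mod 2 = 0));
  assert (handle_call [ 1; 2; 3; 4 ] = Ok [ 2; 4 ])
```

The `(* TODO: switch all those option ref to Hook.t *)` comment in the diff suggests the maintainers plan to replace these bare refs with a dedicated abstraction.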

Changes

Key files modified:

  • Build & Infrastructure:

    • .github/workflows/*: Updates Python version and Windows build configurations
    • Dockerfile: Upgrades Alpine base image and Python version
    • Makefile: Improves Windows dependency handling
  • Core Functionality:

    • src/core/Core_result.ml: Adds symbol analysis support
    • src/targeting/Find_targets.ml: Improves file targeting with capabilities
    • libs/commons/CapFS.ml: New capability-based filesystem operations
    • TCB/Cap.ml: Expands capability system
    • cli/src/semgrep/git.py: Improves Git worktree handling

sequenceDiagram
    participant CLI as Semgrep CLI
    participant Core as Semgrep Core
    participant FS as File System
    participant API as Semgrep API

    CLI->>Core: Initialize scan
    Core->>FS: Initialize CapFS capabilities
    Core->>FS: List target files
    FS-->>Core: Return filtered files
    CLI->>Core: Run analysis
    Core->>Core: Perform symbol analysis
    Core->>API: Upload symbol analysis
    Core-->>CLI: Return scan results

@renovate renovate bot force-pushed the renovate/semgrep-1.x branch from d3076ef to f6a6c20 Compare February 26, 2025 19:06
@renovate renovate bot changed the title chore(deps): update dependency semgrep to ~=1.108.0 chore(deps): update dependency semgrep to ~=1.109.0 Feb 26, 2025