diff --git a/docs/config.md b/docs/config.md
index afac78a27..cc2538be5 100644
--- a/docs/config.md
+++ b/docs/config.md
@@ -197,9 +197,13 @@ footer: ({path}) => `
+## preserveIndex
-Whether page links should be “clean”, _i.e._, formatted without a `.html` extension. Defaults to true. If true, a link to `config.html` will be formatted as `config`. Regardless of this setting, a link to an index page will drop the implied `index.html`; for example `foo/index.html` will be formatted as `foo/`.
+Whether page links should preserve `/index` for directories. Defaults to false. If true, a link to `/` will be formatted as `/index` if the **preserveExtension** option is false or `/index.html` if the **preserveExtension** option is true.
+
+## preserveExtension
+
+Whether page links should preserve the `.html` extension. Defaults to false. If true, a link to `/foo` will be formatted as `/foo.html`.
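+
+For example, a minimal sketch of a config combining both options, so that a link to `/` is formatted as `/index.html`:
+
+```js run=false
+export default {
+  preserveIndex: true, // format links to directories with a trailing /index
+  preserveExtension: true // keep the .html extension on page links
+};
+```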
## toc
@@ -297,6 +301,45 @@ export default {
};
```
+## duckdb
+
+The **duckdb** option configures [self-hosting](./lib/duckdb#self-hosting-of-extensions) and loading of [DuckDB extensions](./lib/duckdb#extensions) for use in [SQL code blocks](./sql) and the `sql` and `DuckDBClient` built-ins. For example, a geospatial data app might enable the [`spatial`](https://duckdb.org/docs/extensions/spatial/overview.html) and [`h3`](https://duckdb.org/community_extensions/extensions/h3.html) extensions like so:
+
+```js run=false
+export default {
+ duckdb: {
+ extensions: ["spatial", "h3"]
+ }
+};
+```
+
+The **extensions** option can be either an array of extension names or an object whose keys are extension names and whose values are configuration options for the given extension: its **source** repository (defaulting to the keyword _core_ for core extensions and otherwise _community_; a custom repository URL is also allowed), whether to **load** it immediately (defaulting to true, except for known extensions that support autoloading), and whether to **install** it (_i.e._, to self-host it, defaulting to true). As additional shorthand, you can specify `[name]: true` to install and load the named extension from the default (_core_ or _community_) source repository, or `[name]: string` to install and load the named extension from the given source repository.
+
+The configuration above is equivalent to:
+
+```js run=false
+export default {
+ duckdb: {
+ extensions: {
+ spatial: {
+ source: "https://extensions.duckdb.org/",
+ install: true,
+ load: true
+ },
+ h3: {
+ source: "https://community-extensions.duckdb.org/",
+ install: true,
+ load: true
+ }
+ }
+ }
+};
+```
+
+The `json` and `parquet` extensions are configured (and therefore self-hosted) by default. To expressly disable self-hosting of an extension, you can set its **install** property to false, or equivalently pass null as the extension configuration object.
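+
+For example, a sketch of disabling self-hosting of the `json` extension by passing null as its configuration:
+
+```js run=false
+export default {
+  duckdb: {
+    extensions: {
+      json: null // don’t self-host json; equivalent to setting install: false
+    }
+  }
+};
+```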
+
+For more, see [DuckDB extensions](./lib/duckdb#extensions).
+
## markdownIt
A hook for registering additional [markdown-it](https://github.com/markdown-it/markdown-it) plugins. For example, to use [markdown-it-footnote](https://github.com/markdown-it/markdown-it-footnote), first install the plugin with either `npm add markdown-it-footnote` or `yarn add markdown-it-footnote`, then register it like so:
diff --git a/docs/getting-started.md b/docs/getting-started.md
index 9587bec7f..ede7d4390 100644
--- a/docs/getting-started.md
+++ b/docs/getting-started.md
@@ -535,7 +535,7 @@ The build command generates the `dist` directory; you can then copy
npx http-server dist
-By default, Framework generates “clean” URLs by dropping the `.html` extension from page links. Not all webhosts support this; some need the cleanUrls config option set to false.
+
By default, Framework generates “clean” URLs by dropping the `.html` extension from page links. Not all webhosts support this; some need the preserveExtension config option set to true.
When deploying to GitHub Pages without using GitHub’s related actions (configure-pages,
deploy-pages, and
diff --git a/docs/lib/duckdb.md b/docs/lib/duckdb.md
index 7fc98451d..1cdfd72a0 100644
--- a/docs/lib/duckdb.md
+++ b/docs/lib/duckdb.md
@@ -65,7 +65,7 @@ const db2 = await DuckDBClient.of({base: FileAttachment("quakes.db")});
db2.queryRow(`SELECT COUNT() FROM base.events`)
```
-For externally-hosted data, you can create an empty `DuckDBClient` and load a table from a SQL query, say using [`read_parquet`](https://duckdb.org/docs/guides/import/parquet_import) or [`read_csv`](https://duckdb.org/docs/guides/import/csv_import). DuckDB offers many affordances to make this easier (in many cases it detects the file format and uses the correct loader automatically).
+For externally-hosted data, you can create an empty `DuckDBClient` and load a table from a SQL query, say using [`read_parquet`](https://duckdb.org/docs/guides/import/parquet_import) or [`read_csv`](https://duckdb.org/docs/guides/import/csv_import). DuckDB offers many affordances to make this easier. (In many cases it detects the file format and uses the correct loader automatically.)
```js run=false
const db = await DuckDBClient.of();
@@ -105,3 +105,96 @@ const sql = DuckDBClient.sql({quakes: `https://earthquake.usgs.gov/earthquakes/f
```sql echo
SELECT * FROM quakes ORDER BY updated DESC;
```
+
+## Extensions
+
+[DuckDB extensions](https://duckdb.org/docs/extensions/overview.html) extend DuckDB’s functionality, adding support for additional file formats, new types, and domain-specific functions. For example, the [`json` extension](https://duckdb.org/docs/data/json/overview.html) provides a `read_json` method for reading JSON files:
+
+```sql echo
+SELECT bbox FROM read_json('https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/all_day.geojson');
+```
+
+To read a local file (or data loader), use `FileAttachment` and interpolation `${…}`:
+
+```sql echo
+SELECT bbox FROM read_json(${FileAttachment("../quakes.json").href});
+```
+
+For convenience, Framework configures the `json` and `parquet` extensions by default. Some other [core extensions](https://duckdb.org/docs/extensions/core_extensions.html) also autoload, meaning that you don’t need to explicitly enable them; however, Framework will only [self-host extensions](#self-hosting-of-extensions) if you explicitly configure them, and therefore we recommend that you always use the [**duckdb** config option](../config#duckdb) to configure DuckDB extensions. Any configured extensions will be automatically [installed and loaded](https://duckdb.org/docs/extensions/overview#explicit-install-and-load), making them available in SQL code blocks as well as the `sql` and `DuckDBClient` built-ins.
+
+For example, to configure the [`spatial` extension](https://duckdb.org/docs/extensions/spatial/overview.html):
+
+```js run=false
+export default {
+ duckdb: {
+ extensions: ["spatial"]
+ }
+};
+```
+
+You can then use the `ST_Area` function to compute the area of a polygon:
+
+```sql echo run=false
+SELECT ST_Area('POLYGON((0 0, 0 1, 1 1, 1 0, 0 0))'::GEOMETRY) as area;
+```
+
+To tell which extensions have been loaded, you can run the following query:
+
+```sql echo
+FROM duckdb_extensions() WHERE loaded;
+```
+
+<div class="note">
+
+If the `duckdb_extensions()` function runs before DuckDB autoloads a core extension (such as `json`), it might not be included in the returned set.
+
+</div>
+
+### Self-hosting of extensions
+
+As with [npm imports](../imports#self-hosting-of-npm-imports), configured DuckDB extensions are self-hosted, improving performance, stability, and security, and allowing you to develop offline. Extensions are downloaded to the DuckDB cache folder, which lives in `.observablehq/cache/_duckdb` within the source root (typically `src`). You can clear the cache and restart the preview server to re-fetch the latest versions of any DuckDB extensions. If you use an [autoloading core extension](https://duckdb.org/docs/extensions/core_extensions.html#list-of-core-extensions) that is not configured, DuckDB-Wasm [will load it](https://duckdb.org/docs/api/wasm/extensions.html#fetching-duckdb-wasm-extensions) from the default extension repository, `extensions.duckdb.org`, at runtime.
+
+## Configuring
+
+The second argument to `DuckDBClient.of` and `DuckDBClient.sql` is a [`DuckDBConfig`](https://shell.duckdb.org/docs/interfaces/index.DuckDBConfig.html) object which configures the behavior of DuckDB-Wasm. By default, Framework sets the `castBigIntToDouble` and `castTimestampToDate` query options to true. To instead use [`BigInt`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/BigInt):
+
+```js run=false
+const bigdb = DuckDBClient.of({}, {query: {castBigIntToDouble: false}});
+```
+
+By default, `DuckDBClient.of` and `DuckDBClient.sql` automatically load all [configured extensions](#extensions). To change the loaded extensions for a particular `DuckDBClient`, use the **extensions** config option. For example, pass an empty array to instantiate a DuckDBClient with no loaded extensions (even if your configuration lists several):
+
+```js echo run=false
+const simpledb = DuckDBClient.of({}, {extensions: []});
+```
+
+Alternatively, you can configure extensions to be self-hosted but not loaded by default, using the **duckdb** config option and the boolean shorthand `false` (equivalent to `{load: false}`):
+
+```js run=false
+export default {
+ duckdb: {
+ extensions: {
+ spatial: false,
+ h3: false
+ }
+ }
+};
+```
+
+You can then selectively load extensions as needed like so:
+
+```js echo run=false
+const geosql = DuckDBClient.sql({}, {extensions: ["spatial", "h3"]});
+```
+
+In the future, we’d like to allow DuckDB to be configured globally (beyond just [extensions](#extensions)) via the [**duckdb** config option](../config#duckdb); please upvote [#1791](https://github.com/observablehq/framework/issues/1791) if you are interested in this feature.
+
+## Versioning
+
+Framework currently uses [DuckDB-Wasm 1.29.0](https://github.com/duckdb/duckdb-wasm/releases/tag/v1.29.0), which aligns with [DuckDB 1.1.1](https://github.com/duckdb/duckdb/releases/tag/v1.1.1). You can load a different version of DuckDB-Wasm by importing `npm:@duckdb/duckdb-wasm` directly, for example:
+
+```js run=false
+import * as duckdb from "npm:@duckdb/duckdb-wasm@1.28.0";
+```
+
+However, you will not be able to change the version of DuckDB-Wasm used by SQL code blocks or the `sql` or `DuckDBClient` built-ins, nor can you use Framework’s support for self-hosting extensions with a different version of DuckDB-Wasm.
diff --git a/docs/project-structure.md b/docs/project-structure.md
index 3d36774ce..ef625e8dc 100644
--- a/docs/project-structure.md
+++ b/docs/project-structure.md
@@ -99,7 +99,7 @@ For this site, routes map to files as:
/hello → dist/hello.html → src/hello.md
```
-This assumes [“clean URLs”](./config#clean-urls) as supported by most static site servers; `/hello` can also be accessed as `/hello.html`, and `/` can be accessed as `/index` and `/index.html`. (Some static site servers automatically redirect to clean URLs, but we recommend being consistent when linking to your site.)
+This assumes [“clean URLs”](./config#preserve-extension) as supported by most static site servers; `/hello` can also be accessed as `/hello.html`, and `/` can be accessed as `/index` and `/index.html`. (Some static site servers automatically redirect to clean URLs, but we recommend being consistent when linking to your site.)
Apps should always have a top-level `index.md` in the source root; this is your app’s home page, and it’s what people visit by default.
diff --git a/docs/sql.md b/docs/sql.md
index 3796f1a53..4748989ca 100644
--- a/docs/sql.md
+++ b/docs/sql.md
@@ -29,7 +29,7 @@ sql:
For performance and reliability, we recommend using local files rather than loading data from external servers at runtime. You can use a data loader to take a snapshot of remote data during build if needed.
-You can also register tables via code (say to have sources that are defined dynamically via user input) by defining the `sql` symbol with [DuckDBClient.sql](./lib/duckdb).
+You can also register tables via code (say to have sources that are defined dynamically via user input) by defining the `sql` symbol with [DuckDBClient.sql](./lib/duckdb). To register [DuckDB extensions](./lib/duckdb#extensions), use the [**duckdb** config option](./config#duckdb).
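+
+For example, a sketch of enabling the `spatial` extension via the config file, mirroring the configuration documented for the **duckdb** option:
+
+```js run=false
+export default {
+  duckdb: {
+    extensions: ["spatial"] // install, self-host, and load the spatial extension
+  }
+};
+```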
## SQL code blocks
diff --git a/package.json b/package.json
index 2ef44b4a3..54d78bfa1 100644
--- a/package.json
+++ b/package.json
@@ -24,11 +24,12 @@
"docs:deploy": "tsx --no-warnings=ExperimentalWarning ./src/bin/observable.ts deploy",
"build": "rimraf dist && node build.js --outdir=dist --outbase=src \"src/**/*.{ts,js,css}\" --ignore \"**/*.d.ts\"",
"test": "concurrently npm:test:mocha npm:test:tsc npm:test:lint npm:test:prettier",
- "test:coverage": "c8 --check-coverage --lines 80 --per-file yarn test:mocha",
- "test:build": "rimraf test/build && cross-env npm_package_version=1.0.0-test node build.js --sourcemap --outdir=test/build \"{src,test}/**/*.{ts,js,css}\" --ignore \"test/input/**\" --ignore \"test/output/**\" --ignore \"test/preview/dashboard/**\" --ignore \"**/*.d.ts\" && cp -r templates test/build",
- "test:mocha": "yarn test:build && rimraf --glob test/.observablehq/cache test/input/build/*/.observablehq/cache && cross-env OBSERVABLE_TELEMETRY_DISABLE=1 TZ=America/Los_Angeles mocha --timeout 30000 -p \"test/build/test/**/*-test.js\" && yarn test:annotate",
- "test:mocha:serial": "yarn test:build && rimraf --glob test/.observablehq/cache test/input/build/*/.observablehq/cache && cross-env OBSERVABLE_TELEMETRY_DISABLE=1 TZ=America/Los_Angeles mocha --timeout 30000 \"test/build/test/**/*-test.js\" && yarn test:annotate",
- "test:annotate": "yarn test:build && cross-env OBSERVABLE_ANNOTATE_FILES=true TZ=America/Los_Angeles mocha --timeout 30000 \"test/build/test/**/annotate.js\"",
+ "test:coverage": "c8 --check-coverage --lines 80 --per-file yarn test:mocha:all",
+ "test:build": "rimraf test/build && rimraf --glob test/.observablehq/cache test/input/build/*/.observablehq/cache && cross-env npm_package_version=1.0.0-test node build.js --sourcemap --outdir=test/build \"{src,test}/**/*.{ts,js,css}\" --ignore \"test/input/**\" --ignore \"test/output/**\" --ignore \"test/preview/dashboard/**\" --ignore \"**/*.d.ts\" && cp -r templates test/build",
+ "test:mocha": "yarn test:mocha:serial -p",
+ "test:mocha:serial": "yarn test:build && cross-env OBSERVABLE_TELEMETRY_DISABLE=1 TZ=America/Los_Angeles mocha --timeout 30000 \"test/build/test/**/*-test.js\"",
+ "test:mocha:annotate": "yarn test:build && cross-env OBSERVABLE_TELEMETRY_DISABLE=1 OBSERVABLE_ANNOTATE_FILES=true TZ=America/Los_Angeles mocha --timeout 30000 \"test/build/test/**/annotate.js\"",
+ "test:mocha:all": "yarn test:mocha && cross-env OBSERVABLE_TELEMETRY_DISABLE=1 OBSERVABLE_ANNOTATE_FILES=true TZ=America/Los_Angeles mocha --timeout 30000 \"test/build/test/**/annotate.js\"",
"test:lint": "eslint src test --max-warnings=0",
"test:prettier": "prettier --check src test",
"test:tsc": "tsc --noEmit",
@@ -56,7 +57,8 @@
"dependencies": {
"@clack/prompts": "^0.7.0",
"@observablehq/inputs": "^0.12.0",
- "@observablehq/runtime": "^5.9.4",
+ "@observablehq/inspector": "^5.0.1",
+ "@observablehq/runtime": "^6.0.0-rc.1",
"@rollup/plugin-commonjs": "^25.0.7",
"@rollup/plugin-json": "^6.1.0",
"@rollup/plugin-node-resolve": "^15.2.3",
diff --git a/src/build.ts b/src/build.ts
index ccf9df04d..0f06635a4 100644
--- a/src/build.ts
+++ b/src/build.ts
@@ -3,6 +3,7 @@ import {existsSync} from "node:fs";
import {copyFile, readFile, rm, stat, writeFile} from "node:fs/promises";
import {basename, dirname, extname, join} from "node:path/posix";
import type {Config} from "./config.js";
+import {getDuckDBManifest} from "./duckdb.js";
import {CliError} from "./error.js";
import {getClientPath, prepareOutput} from "./files.js";
import {findModule, getModuleHash, readJavaScript} from "./javascript/module.js";
@@ -53,7 +54,7 @@ export async function build(
{config}: BuildOptions,
effects: BuildEffects = new FileBuildEffects(config.output, join(config.root, ".observablehq", "cache"))
): Promise<void> {
- const {root, loaders} = config;
+ const {root, loaders, duckdb} = config;
Telemetry.record({event: "build", step: "start"});
// Prepare for build (such as by emptying the existing output root).
@@ -140,6 +141,21 @@ export async function build(
effects.logger.log(cachePath);
}
+ // Copy over the DuckDB extensions, initializing aliases that are needed to
+ // construct the DuckDB manifest.
+ for (const path of globalImports) {
+ if (path.startsWith("/_duckdb/")) {
+ const sourcePath = join(cacheRoot, path);
+ effects.output.write(`${faint("build")} ${path} ${faint("→")} `);
+ const contents = await readFile(sourcePath);
+ const hash = createHash("sha256").update(contents).digest("hex").slice(0, 8);
+ const [, , , version, bundle, name] = path.split("/");
+ const alias = join("/_duckdb/", `${basename(name, ".duckdb_extension.wasm")}-${hash}`, version, bundle, name);
+ aliases.set(path, alias);
+ await effects.writeFile(alias, contents);
+ }
+ }
+
// Generate the client bundles. These are initially generated into the cache
// because we need to rewrite any npm and node imports to be hashed; this is
// handled generally for all global imports below.
@@ -149,6 +165,7 @@ export async function build(
effects.output.write(`${faint("bundle")} ${path} ${faint("→")} `);
const clientPath = getClientPath(path === "/_observablehq/client.js" ? "index.js" : path.slice("/_observablehq/".length)); // prettier-ignore
const define: {[key: string]: string} = {};
+ if (path === "/_observablehq/stdlib/duckdb.js") define["DUCKDB_MANIFEST"] = JSON.stringify(await getDuckDBManifest(duckdb, {root, aliases})); // prettier-ignore
const contents = await rollupClient(clientPath, root, path, {minify: true, keepNames: true, define});
await prepareOutput(cachePath);
await writeFile(cachePath, contents);
@@ -202,9 +219,10 @@ export async function build(
// Copy over global assets (e.g., minisearch.json, DuckDB’s WebAssembly).
// Anything in _observablehq also needs a content hash, but anything in _npm
- // or _node does not (because they are already necessarily immutable).
+ // or _node does not (because they are already necessarily immutable). We’re
+ // skipping DuckDB’s extensions because they were previously copied above.
for (const path of globalImports) {
- if (path.endsWith(".js")) continue;
+ if (path.endsWith(".js") || path.startsWith("/_duckdb/")) continue;
const sourcePath = join(cacheRoot, path);
effects.output.write(`${faint("build")} ${path} ${faint("→")} `);
if (path.startsWith("/_observablehq/")) {
diff --git a/src/client/runtime.js b/src/client/runtime.js
index d6c74f4a7..758e28008 100644
--- a/src/client/runtime.js
+++ b/src/client/runtime.js
@@ -1 +1,2 @@
-export {Inspector, Runtime, RuntimeError} from "@observablehq/runtime";
+export {Inspector} from "@observablehq/inspector";
+export {Runtime, RuntimeError} from "@observablehq/runtime";
diff --git a/src/client/stdlib.js b/src/client/stdlib.js
index 6a891820a..451182c42 100644
--- a/src/client/stdlib.js
+++ b/src/client/stdlib.js
@@ -2,5 +2,3 @@ export {AbstractFile, FileAttachment, registerFile} from "./stdlib/fileAttachmen
export * as Generators from "./stdlib/generators/index.js";
export {Mutable} from "./stdlib/mutable.js";
export {resize} from "./stdlib/resize.js";
-export class Library {} // TODO remove @observablehq/runtime dependency
-export const FileAttachments = undefined; // TODO remove @observablehq/runtime dependency
diff --git a/src/client/stdlib/duckdb.js b/src/client/stdlib/duckdb.js
index 950e5bebc..42be646ad 100644
--- a/src/client/stdlib/duckdb.js
+++ b/src/client/stdlib/duckdb.js
@@ -1,3 +1,4 @@
+/* global DUCKDB_MANIFEST */
import * as duckdb from "npm:@duckdb/duckdb-wasm";
// Adapted from https://observablehq.com/@cmudig/duckdb-client
@@ -29,17 +30,21 @@ import * as duckdb from "npm:@duckdb/duckdb-wasm";
// ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
// POSSIBILITY OF SUCH DAMAGE.
-const bundle = await duckdb.selectBundle({
- mvp: {
- mainModule: import.meta.resolve("npm:@duckdb/duckdb-wasm/dist/duckdb-mvp.wasm"),
- mainWorker: import.meta.resolve("npm:@duckdb/duckdb-wasm/dist/duckdb-browser-mvp.worker.js")
- },
- eh: {
- mainModule: import.meta.resolve("npm:@duckdb/duckdb-wasm/dist/duckdb-eh.wasm"),
- mainWorker: import.meta.resolve("npm:@duckdb/duckdb-wasm/dist/duckdb-browser-eh.worker.js")
- }
-});
-
+const bundles = {
+ mvp: DUCKDB_MANIFEST.platforms.mvp
+ ? {
+ mainModule: import.meta.resolve("npm:@duckdb/duckdb-wasm/dist/duckdb-mvp.wasm"),
+ mainWorker: import.meta.resolve("npm:@duckdb/duckdb-wasm/dist/duckdb-browser-mvp.worker.js")
+ }
+ : undefined,
+ eh: DUCKDB_MANIFEST.platforms.eh
+ ? {
+ mainModule: import.meta.resolve("npm:@duckdb/duckdb-wasm/dist/duckdb-eh.wasm"),
+ mainWorker: import.meta.resolve("npm:@duckdb/duckdb-wasm/dist/duckdb-browser-eh.worker.js")
+ }
+ : undefined
+};
+const bundle = duckdb.selectBundle(bundles);
const logger = new duckdb.ConsoleLogger(duckdb.LogLevel.WARNING);
let db;
@@ -169,6 +174,7 @@ export class DuckDBClient {
config = {...config, query: {...config.query, castBigIntToDouble: true}};
}
await db.open(config);
+ await registerExtensions(db, config.extensions);
await Promise.all(Object.entries(sources).map(([name, source]) => insertSource(db, name, source)));
return new DuckDBClient(db);
}
@@ -178,9 +184,24 @@ export class DuckDBClient {
}
}
-Object.defineProperty(DuckDBClient.prototype, "dialect", {
- value: "duckdb"
-});
+Object.defineProperty(DuckDBClient.prototype, "dialect", {value: "duckdb"});
+
+async function registerExtensions(db, extensions) {
+ const {mainModule} = await bundle;
+ const platform = Object.keys(bundles).find((platform) => mainModule === bundles[platform].mainModule);
+ const con = await db.connect();
+ try {
+ await Promise.all(
+ Object.entries(DUCKDB_MANIFEST.extensions).map(([name, {load, [platform]: ref}]) =>
+ con
+ .query(`INSTALL "${name}" FROM '${import.meta.resolve(ref)}'`)
+ .then(() => (extensions === undefined ? load : extensions.includes(name)) && con.query(`LOAD "${name}"`))
+ )
+ );
+ } finally {
+ await con.close();
+ }
+}
async function insertSource(database, name, source) {
source = await source;
@@ -258,7 +279,7 @@ async function insertFile(database, name, file, options) {
});
}
if (/\.parquet$/i.test(file.name)) {
- const table = file.size < 10e6 ? "TABLE" : "VIEW"; // for small files, materialize the table
+ const table = file.size < 50e6 ? "TABLE" : "VIEW"; // for small files, materialize the table
return await connection.query(`CREATE ${table} '${name}' AS SELECT * FROM parquet_scan('${file.name}')`);
}
if (/\.(db|ddb|duckdb)$/i.test(file.name)) {
@@ -306,9 +327,10 @@ async function insertArray(database, name, array, options) {
}
async function createDuckDB() {
- const worker = await duckdb.createWorker(bundle.mainWorker);
+ const {mainWorker, mainModule} = await bundle;
+ const worker = await duckdb.createWorker(mainWorker);
const db = new duckdb.AsyncDuckDB(logger, worker);
- await db.instantiate(bundle.mainModule);
+ await db.instantiate(mainModule);
return db;
}
diff --git a/src/config.ts b/src/config.ts
index 126445a90..bf063bc46 100644
--- a/src/config.ts
+++ b/src/config.ts
@@ -8,6 +8,7 @@ import {pathToFileURL} from "node:url";
import he from "he";
import type MarkdownIt from "markdown-it";
import wrapAnsi from "wrap-ansi";
+import {DUCKDB_CORE_ALIASES, DUCKDB_CORE_EXTENSIONS} from "./duckdb.js";
import {visitFiles} from "./files.js";
import {formatIsoDate, formatLocaleDate} from "./format.js";
import type {FrontMatter} from "./frontMatter.js";
@@ -76,6 +77,23 @@ export interface SearchConfigSpec {
index?: unknown;
}
+export interface DuckDBConfig {
+ platforms: {[name: string]: true};
+ extensions: {[name: string]: DuckDBExtensionConfig};
+}
+
+export interface DuckDBExtensionConfig {
+ source: string;
+ install: boolean;
+ load: boolean;
+}
+
+interface DuckDBExtensionConfigSpec {
+ source: unknown;
+ install: unknown;
+ load: unknown;
+}
+
export interface Config {
root: string; // defaults to src
output: string; // defaults to dist
@@ -98,6 +116,7 @@ export interface Config {
normalizePath: (path: string) => string;
loaders: LoaderResolver;
watchPath?: string;
+ duckdb: DuckDBConfig;
}
export interface ConfigSpec {
@@ -124,7 +143,10 @@ export interface ConfigSpec {
typographer?: unknown;
quotes?: unknown;
cleanUrls?: unknown;
+ preserveIndex?: unknown;
+ preserveExtension?: unknown;
markdownIt?: unknown;
+ duckdb?: unknown;
}
interface ScriptSpec {
@@ -259,7 +281,8 @@ export function normalizeConfig(spec: ConfigSpec = {}, defaultRoot?: string, wat
const footer = pageFragment(spec.footer === undefined ? defaultFooter() : spec.footer);
const search = spec.search == null || spec.search === false ? null : normalizeSearch(spec.search as any);
const interpreters = normalizeInterpreters(spec.interpreters as any);
- const normalizePath = getPathNormalizer(spec.cleanUrls);
+ const normalizePath = getPathNormalizer(spec);
+ const duckdb = normalizeDuckDB(spec.duckdb);
// If this path ends with a slash, then add an implicit /index to the
// end of the path. Otherwise, remove the .html extension (we use clean
@@ -310,7 +333,8 @@ export function normalizeConfig(spec: ConfigSpec = {}, defaultRoot?: string, wat
md,
normalizePath,
loaders: new LoaderResolver({root, interpreters}),
- watchPath
+ watchPath,
+ duckdb
};
if (pages === undefined) Object.defineProperty(config, "pages", {get: () => readPages(root, md)});
if (sidebar === undefined) Object.defineProperty(config, "sidebar", {get: () => config.pages.length > 0});
@@ -324,13 +348,22 @@ function normalizeDynamicPaths(spec: unknown): Config["paths"] {
return async function* () { yield* paths; }; // prettier-ignore
}
-function getPathNormalizer(spec: unknown = true): (path: string) => string {
- const cleanUrls = Boolean(spec);
+function normalizeCleanUrls(spec: unknown): boolean {
+ console.warn(`${yellow("Warning:")} the ${bold("cleanUrls")} option is deprecated; use ${bold("preserveIndex")} and ${bold("preserveExtension")} instead.`); // prettier-ignore
+ return !spec;
+}
+
+function getPathNormalizer(spec: ConfigSpec): (path: string) => string {
+ const preserveIndex = spec.preserveIndex !== undefined ? Boolean(spec.preserveIndex) : false;
+ const preserveExtension = spec.preserveExtension !== undefined ? Boolean(spec.preserveExtension) : spec.cleanUrls !== undefined ? normalizeCleanUrls(spec.cleanUrls) : false; // prettier-ignore
return (path) => {
- if (path && !path.endsWith("/") && !extname(path)) path += ".html";
- if (path === "index.html") path = ".";
- else if (path.endsWith("/index.html")) path = path.slice(0, -"index.html".length);
- else if (cleanUrls) path = path.replace(/\.html$/, "");
+ const ext = extname(path);
+ if (path.endsWith(".")) path += "/";
+ if (ext === ".html") path = path.slice(0, -".html".length);
+ if (path.endsWith("/index")) path = path.slice(0, -"index".length);
+ if (preserveIndex && path.endsWith("/")) path += "index";
+ if (!preserveIndex && path === "index") path = ".";
+ if (preserveExtension && path && !path.endsWith(".") && !path.endsWith("/") && !extname(path)) path += ".html";
return path;
};
}
@@ -488,3 +521,49 @@ export function mergeStyle(
export function stringOrNull(spec: unknown): string | null {
return spec == null || spec === false ? null : String(spec);
}
+
+function normalizeDuckDB(spec: unknown): DuckDBConfig {
+ const {mvp = true, eh = true} = spec?.["platforms"] ?? {};
+ const extensions: {[name: string]: DuckDBExtensionConfig} = {};
+  let extspec: Record<string, unknown> = spec?.["extensions"] ?? {};
+ if (Array.isArray(extspec)) extspec = Object.fromEntries(extspec.map((name) => [name, {}]));
+ if (extspec.json === undefined) extspec = {...extspec, json: false};
+ if (extspec.parquet === undefined) extspec = {...extspec, parquet: false};
+ for (let name in extspec) {
+ if (!/^\w+$/.test(name)) throw new Error(`invalid extension: ${name}`);
+ const vspec = extspec[name];
+ if (vspec == null) continue;
+ name = DUCKDB_CORE_ALIASES[name] ?? name;
+ const {
+ source = name in DUCKDB_CORE_EXTENSIONS ? "core" : "community",
+ install = true,
+ load = !DUCKDB_CORE_EXTENSIONS[name]
+ } = typeof vspec === "boolean"
+ ? {load: vspec}
+ : typeof vspec === "string"
+ ? {source: vspec}
+ : (vspec as DuckDBExtensionConfigSpec);
+ extensions[name] = {
+ source: normalizeDuckDBSource(String(source)),
+ install: Boolean(install),
+ load: Boolean(load)
+ };
+ }
+ return {
+ platforms: Object.fromEntries(
+ [
+ ["mvp", mvp],
+ ["eh", eh]
+ ].filter(([, enabled]) => enabled)
+ ),
+ extensions
+ };
+}
+
+function normalizeDuckDBSource(source: string): string {
+ if (source === "core") return "https://extensions.duckdb.org/";
+ if (source === "community") return "https://community-extensions.duckdb.org/";
+ const url = new URL(source);
+ if (url.protocol !== "https:") throw new Error(`invalid source: ${source}`);
+ return String(url);
+}
diff --git a/src/duckdb.ts b/src/duckdb.ts
new file mode 100644
index 000000000..bc4360a50
--- /dev/null
+++ b/src/duckdb.ts
@@ -0,0 +1,127 @@
+import {existsSync} from "node:fs";
+import {mkdir, writeFile} from "node:fs/promises";
+import {dirname, join} from "node:path/posix";
+import type {DuckDBConfig} from "./config.js";
+import {faint} from "./tty.js";
+
+const downloadRequests = new Map<string, Promise<string>>();
+
+export const DUCKDB_WASM_VERSION = "1.29.0";
+export const DUCKDB_VERSION = "1.1.1";
+
+// https://duckdb.org/docs/extensions/core_extensions.html
+export const DUCKDB_CORE_ALIASES: Record<string, string> = {
+ sqlite: "sqlite_scanner",
+ sqlite3: "sqlite_scanner",
+ postgres_scanner: "postgres",
+ http: "httpfs",
+ https: "httpfs",
+ s3: "httpfs"
+} as const;
+
+// https://duckdb.org/docs/extensions/core_extensions.html
+// https://duckdb.org/docs/api/wasm/extensions.html#list-of-officially-available-extensions
+export const DUCKDB_CORE_EXTENSIONS = {
+ arrow: false,
+ autocomplete: true,
+ aws: true,
+ azure: true,
+ delta: true,
+ excel: true,
+ fts: true,
+ httpfs: true,
+ iceberg: false,
+ icu: true,
+ inet: true,
+ jemalloc: false,
+ json: true,
+ mysql: false,
+ parquet: true,
+ postgres: true,
+ spatial: false,
+ sqlite_scanner: true,
+ substrait: false,
+ tpcds: true,
+ tpch: true,
+ vss: false
+} as const;
+
+export async function getDuckDBManifest(
+ {platforms, extensions}: DuckDBConfig,
+  {root, aliases}: {root: string; aliases?: Map<string, string>}
+) {
+ return {
+ platforms: {mvp: "mvp" in platforms, eh: "eh" in platforms},
+ extensions: Object.fromEntries(
+ await Promise.all(
+ Object.entries(extensions).map(([name, {install, load, source}]) =>
+ (async () => [
+ name,
+ {
+ install,
+ load,
+ ...Object.fromEntries(
+ await Promise.all(
+ Object.keys(platforms).map(async (platform) => [
+ platform,
+ install
+ ? await getDuckDBExtension(root, resolveDuckDBExtension(source, platform, name), aliases)
+ : source
+ ])
+ )
+ )
+ }
+ ])()
+ )
+ )
+ )
+ };
+}
+
+export function resolveDuckDBExtension(repo: string, platform: string, name: string): URL {
+ return new URL(`v${DUCKDB_VERSION}/wasm_${platform}/${name}.duckdb_extension.wasm`, repo);
+}
+
+/**
+ * Returns the extension “custom repository” location as needed for DuckDB’s
+ * INSTALL command. This is the relative path to which DuckDB will implicitly add
+ * v{version}/wasm_{platform}/{name}.duckdb_extension.wasm, assuming that the
+ * manifest is baked into /_observablehq/stdlib/duckdb.js.
+ *
+ * https://duckdb.org/docs/extensions/working_with_extensions#creating-a-custom-repository
+ */
+async function getDuckDBExtension(root: string, href: string | URL, aliases?: Map<string, string>) {
+ let ext = await cacheDuckDBExtension(root, href);
+ if (aliases?.has(ext)) ext = aliases.get(ext)!;
+ return join("..", "..", dirname(dirname(dirname(ext))));
+}
+
+/**
+ * Saves the given DuckDB extension to the .observablehq/cache/_duckdb cache,
+ * as {origin}/{path}/{name}.duckdb_extension.wasm, returning the serving path
+ * to the saved file in the cache (starting with /_duckdb).
+ *
+ * https://duckdb.org/docs/extensions/overview#installation-location
+ */
+export async function cacheDuckDBExtension(root: string, href: string | URL): Promise<string> {
+ const url = new URL(href);
+ if (url.protocol !== "https:") throw new Error(`unsupported protocol: ${url.protocol}`);
+ const key = String(url).slice("https://".length);
+ const path = join("_duckdb", key);
+ const cache = join(root, ".observablehq", "cache");
+ const cachePath = join(cache, path);
+ if (existsSync(cachePath)) return `/${path}`;
+ let promise = downloadRequests.get(cachePath);
+ if (promise) return promise; // coalesce concurrent requests
+ promise = (async () => {
+ console.log(`duckdb:${key} ${faint("→")} ${cachePath}`);
+ const response = await fetch(url);
+ if (!response.ok) throw new Error(`unable to fetch: ${url}`);
+ await mkdir(dirname(cachePath), {recursive: true});
+ await writeFile(cachePath, Buffer.from(await response.arrayBuffer()));
+ return `/${path}`;
+ })();
+ promise.catch(console.error).then(() => downloadRequests.delete(cachePath));
+ downloadRequests.set(cachePath, promise);
+ return promise;
+}
diff --git a/src/javascript/annotate.ts b/src/javascript/annotate.ts
index 92cfe4510..2d2fb031e 100644
--- a/src/javascript/annotate.ts
+++ b/src/javascript/annotate.ts
@@ -1,14 +1,9 @@
import {isPathImport} from "../path.js";
-/**
- * Annotate a path to a local import or file so it can be reworked server-side.
- */
-
const annotate = process.env["OBSERVABLE_ANNOTATE_FILES"];
-if (typeof annotate === "string" && annotate !== "true")
- throw new Error(`unsupported OBSERVABLE_ANNOTATE_FILES value: ${annotate}`);
-export default annotate
- ? function (uri: string): string {
- return `${JSON.stringify(uri)}${isPathImport(uri) ? "/* observablehq-file */" : ""}`;
- }
+if (annotate && annotate !== "true") throw new Error(`unsupported OBSERVABLE_ANNOTATE_FILES: ${annotate}`);
+
+/** Annotate a path to a local import or file so it can be reworked server-side. */
+export const annotatePath = annotate
+ ? (uri: string) => `${JSON.stringify(uri)}${isPathImport(uri) ? "/* observablehq-file */" : ""}`
: JSON.stringify;
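The renamed `annotatePath` export behaves as before: when annotation is enabled, local path imports get a trailing marker comment that the server can rewrite later, while bare specifiers are emitted as plain JSON strings. A self-contained sketch (the `isPathImport` stand-in below is simplified; the real one lives in `../path.js`):

```typescript
// Simplified stand-in for isPathImport from ../path.js.
const isPathImport = (s: string) => s.startsWith("./") || s.startsWith("../") || s.startsWith("/");

// Annotated form used when OBSERVABLE_ANNOTATE_FILES is set.
const annotatePath = (uri: string): string =>
  `${JSON.stringify(uri)}${isPathImport(uri) ? "/* observablehq-file */" : ""}`;

console.log(annotatePath("./chart.js")); // "./chart.js"/* observablehq-file */
console.log(annotatePath("npm:d3"));     // "npm:d3"
```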
diff --git a/src/javascript/transpile.ts b/src/javascript/transpile.ts
index a4596753a..d68ebbb44 100644
--- a/src/javascript/transpile.ts
+++ b/src/javascript/transpile.ts
@@ -7,7 +7,7 @@ import {isPathImport, relativePath, resolvePath, resolveRelativePath} from "../p
import {getModuleResolver} from "../resolvers.js";
import type {Params} from "../route.js";
import {Sourcemap} from "../sourcemap.js";
-import annotate from "./annotate.js";
+import {annotatePath} from "./annotate.js";
import type {FileExpression} from "./files.js";
import {findFiles} from "./files.js";
import type {ExportNode, ImportNode} from "./imports.js";
@@ -102,7 +102,7 @@ export async function transpileModule(
async function rewriteImportSource(source: StringLiteral) {
const specifier = getStringLiteralValue(source);
- output.replaceLeft(source.start, source.end, annotate(await resolveImport(specifier)));
+ output.replaceLeft(source.start, source.end, annotatePath(await resolveImport(specifier)));
}
for (const {name, node} of findFiles(body, path, input)) {
@@ -116,7 +116,7 @@ export async function transpileModule(
info
? `{"name":${JSON.stringify(p)},"mimeType":${JSON.stringify(
mime.getType(name) ?? undefined
- )},"path":${annotate(relativePath(servePath, resolveFile(name)))},"lastModified":${JSON.stringify(
+ )},"path":${annotatePath(relativePath(servePath, resolveFile(name)))},"lastModified":${JSON.stringify(
info.mtimeMs
)},"size":${JSON.stringify(info.size)}}`
: JSON.stringify(p)
@@ -136,7 +136,7 @@ export async function transpileModule(
if (isImportMetaResolve(node) && isStringLiteral(source)) {
const value = getStringLiteralValue(source);
const resolution = isPathImport(value) && !isJavaScript(value) ? resolveFile(value) : await resolveImport(value);
- output.replaceLeft(source.start, source.end, annotate(resolution));
+ output.replaceLeft(source.start, source.end, annotatePath(resolution));
}
}
@@ -204,7 +204,7 @@ function rewriteImportDeclarations(
for (const node of declarations) {
output.delete(node.start, node.end + +(output.input[node.end] === "\n"));
specifiers.push(rewriteImportSpecifiers(node));
- imports.push(`import(${annotate(resolve(getStringLiteralValue(node.source as StringLiteral)))})`);
+ imports.push(`import(${annotatePath(resolve(getStringLiteralValue(node.source as StringLiteral)))})`);
}
if (declarations.length > 1) {
output.insertLeft(0, `const [${specifiers.join(", ")}] = await Promise.all([${imports.join(", ")}]);\n`);
diff --git a/src/libraries.ts b/src/libraries.ts
index 9c2cba98e..37ef60881 100644
--- a/src/libraries.ts
+++ b/src/libraries.ts
@@ -1,3 +1,6 @@
+import type {DuckDBConfig} from "./config.js";
+import {resolveDuckDBExtension} from "./duckdb.js";
+
export function getImplicitFileImports(methods: Iterable<string>): Set<string> {
const set = setof(methods);
  const implicits = new Set<string>();
@@ -72,7 +75,7 @@ export function getImplicitStylesheets(imports: Iterable<string>): Set<string> {
* library used by FileAttachment) we manually enumerate the needed additional
* downloads here. TODO Support versioned imports, too, such as "npm:leaflet@1".
*/
-export function getImplicitDownloads(imports: Iterable<string>): Set<string> {
+export function getImplicitDownloads(imports: Iterable<string>, duckdb?: DuckDBConfig): Set<string> {
const set = setof(imports);
  const implicits = new Set<string>();
if (set.has("npm:@observablehq/duckdb")) {
@@ -80,6 +83,12 @@ export function getImplicitDownloads(imports: Iterable<string>): Set<string> {
implicits.add("npm:@duckdb/duckdb-wasm/dist/duckdb-browser-mvp.worker.js");
implicits.add("npm:@duckdb/duckdb-wasm/dist/duckdb-eh.wasm");
implicits.add("npm:@duckdb/duckdb-wasm/dist/duckdb-browser-eh.worker.js");
+ if (!duckdb) throw new Error("Implementation error: missing duckdb configuration");
+ for (const [name, {source}] of Object.entries(duckdb.extensions)) {
+ for (const platform in duckdb.platforms) {
+ implicits.add(`duckdb:${resolveDuckDBExtension(source, platform, name)}`);
+ }
+ }
}
if (set.has("npm:@observablehq/sqlite")) {
implicits.add("npm:sql.js/dist/sql-wasm.js");
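The `duckdb:` specifiers added above embed the full extension URL produced by `resolveDuckDBExtension`. A sketch of that URL construction, assuming `DUCKDB_VERSION` is `"1.1.1"` (the version the mock in `test/mocks/duckdb.ts` intercepts); note the trailing slash on the repository base is significant for relative URL resolution:

```typescript
// Assumed version; the real constant is exported from src/duckdb.ts.
const DUCKDB_VERSION = "1.1.1";

// Mirrors resolveDuckDBExtension from the diff above.
function resolveDuckDBExtension(repo: string, platform: string, name: string): URL {
  return new URL(`v${DUCKDB_VERSION}/wasm_${platform}/${name}.duckdb_extension.wasm`, repo);
}

const url = resolveDuckDBExtension("https://extensions.duckdb.org/", "eh", "json");
console.log(String(url)); // https://extensions.duckdb.org/v1.1.1/wasm_eh/json.duckdb_extension.wasm
```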
diff --git a/src/node.ts b/src/node.ts
index 0d8ebc7a1..bb2c340c2 100644
--- a/src/node.ts
+++ b/src/node.ts
@@ -13,7 +13,7 @@ import type {AstNode, OutputChunk, Plugin, ResolveIdResult} from "rollup";
import {rollup} from "rollup";
import esbuild from "rollup-plugin-esbuild";
import {prepareOutput, toOsPath} from "./files.js";
-import annotate from "./javascript/annotate.js";
+import {annotatePath} from "./javascript/annotate.js";
import type {ImportReference} from "./javascript/imports.js";
import {isJavaScript, parseImports} from "./javascript/imports.js";
import {parseNpmSpecifier, rewriteNpmImports} from "./npm.js";
@@ -87,7 +87,7 @@ function isBadCommonJs(specifier: string): boolean {
}
function shimCommonJs(specifier: string, require: NodeRequire): string {
- return `export {${Object.keys(require(specifier))}} from ${annotate(specifier)};\n`;
+ return `export {${Object.keys(require(specifier))}} from ${annotatePath(specifier)};\n`;
}
async function bundle(
diff --git a/src/npm.ts b/src/npm.ts
index b9fc174ca..553ac00de 100644
--- a/src/npm.ts
+++ b/src/npm.ts
@@ -4,8 +4,9 @@ import {dirname, extname, join} from "node:path/posix";
import type {CallExpression} from "acorn";
import {simple} from "acorn-walk";
import {maxSatisfying, rsort, satisfies, validRange} from "semver";
+import {DUCKDB_WASM_VERSION} from "./duckdb.js";
import {isEnoent} from "./error.js";
-import annotate from "./javascript/annotate.js";
+import {annotatePath} from "./javascript/annotate.js";
import type {ExportNode, ImportNode, ImportReference} from "./javascript/imports.js";
import {isImportMetaResolve, parseImports} from "./javascript/imports.js";
import {parseProgram} from "./javascript/parse.js";
@@ -65,7 +66,7 @@ export function rewriteNpmImports(input: string, resolve: (s: string) => string
const value = getStringLiteralValue(source);
const resolved = resolve(value);
if (resolved === undefined || value === resolved) return;
- output.replaceLeft(source.start, source.end, annotate(resolved));
+ output.replaceLeft(source.start, source.end, annotatePath(resolved));
}
// TODO Preserve the source map, but download it too.
@@ -163,7 +164,7 @@ export async function getDependencyResolver(
(name === "arquero" || name === "@uwdata/mosaic-core" || name === "@duckdb/duckdb-wasm") && depName === "apache-arrow" // prettier-ignore
? "latest" // force Arquero, Mosaic & DuckDB-Wasm to use the (same) latest version of Arrow
: name === "@uwdata/mosaic-core" && depName === "@duckdb/duckdb-wasm"
- ? "1.28.0" // force Mosaic to use the latest (stable) version of DuckDB-Wasm
+ ? DUCKDB_WASM_VERSION // force Mosaic to use the latest (stable) version of DuckDB-Wasm
: pkg.dependencies?.[depName] ??
pkg.devDependencies?.[depName] ??
pkg.peerDependencies?.[depName] ??
@@ -249,9 +250,7 @@ async function resolveNpmVersion(root: string, {name, range}: NpmSpecifier): Pro
export async function resolveNpmImport(root: string, specifier: string): Promise<string> {
const {
name,
- range = name === "@duckdb/duckdb-wasm"
- ? "1.28.0" // https://github.com/duckdb/duckdb-wasm/issues/1561
- : undefined,
+ range = name === "@duckdb/duckdb-wasm" ? DUCKDB_WASM_VERSION : undefined,
path = name === "mermaid"
? "dist/mermaid.esm.min.mjs/+esm"
: name === "echarts"
diff --git a/src/preview.ts b/src/preview.ts
index 42f3c931a..4252f44a3 100644
--- a/src/preview.ts
+++ b/src/preview.ts
@@ -16,6 +16,7 @@ import type {WebSocket} from "ws";
import {WebSocketServer} from "ws";
import type {Config} from "./config.js";
import {readConfig} from "./config.js";
+import {getDuckDBManifest} from "./duckdb.js";
import {enoent, isEnoent, isHttpError, isSystemError} from "./error.js";
import {getClientPath} from "./files.js";
import type {FileWatchers} from "./fileWatchers.js";
@@ -118,7 +119,7 @@ export class PreviewServer {
_handleRequest: RequestListener = async (req, res) => {
const config = await this._readConfig();
- const {root, loaders} = config;
+ const {root, loaders, duckdb} = config;
if (this._verbose) console.log(faint(req.method!), req.url);
const url = new URL(req.url!, "http://localhost");
const {origin} = req.headers;
@@ -135,11 +136,15 @@ export class PreviewServer {
end(req, res, await bundleStyles({theme: match.groups!.theme?.split(",") ?? []}), "text/css");
} else if (pathname.startsWith("/_observablehq/") && pathname.endsWith(".js")) {
const path = getClientPath(pathname.slice("/_observablehq/".length));
- end(req, res, await rollupClient(path, root, pathname), "text/javascript");
+ const options =
+ pathname === "/_observablehq/stdlib/duckdb.js"
+ ? {define: {DUCKDB_MANIFEST: JSON.stringify(await getDuckDBManifest(duckdb, {root}))}}
+ : {};
+ end(req, res, await rollupClient(path, root, pathname, options), "text/javascript");
} else if (pathname.startsWith("/_observablehq/") && pathname.endsWith(".css")) {
const path = getClientPath(pathname.slice("/_observablehq/".length));
end(req, res, await bundleStyles({path}), "text/css");
- } else if (pathname.startsWith("/_node/") || pathname.startsWith("/_jsr/")) {
+ } else if (pathname.startsWith("/_node/") || pathname.startsWith("/_jsr/") || pathname.startsWith("/_duckdb/")) {
send(req, pathname, {root: join(root, ".observablehq", "cache")}).pipe(res);
} else if (pathname.startsWith("/_npm/")) {
await populateNpmCache(root, pathname);
@@ -176,8 +181,8 @@ export class PreviewServer {
} else {
if ((pathname = normalize(pathname)).startsWith("..")) throw new Error("Invalid path: " + pathname);
- // Normalize the pathname (e.g., adding ".html" if cleanUrls is false,
- // dropping ".html" if cleanUrls is true) and redirect if necessary.
+ // Normalize the pathname (e.g., adding ".html" or removing ".html"
+ // based on preserveExtension) and redirect if necessary.
const normalizedPathname = encodeURI(config.normalizePath(pathname));
if (url.pathname !== normalizedPathname) {
res.writeHead(302, {Location: normalizedPathname + url.search});
@@ -390,9 +395,9 @@ function handleWatch(socket: WebSocket, req: IncomingMessage, configPromise: Pro
if (path.endsWith("/")) path += "index";
path = join(dirname(path), basename(path, ".html"));
config = await configPromise;
- const {root, loaders, normalizePath} = config;
+ const {root, loaders, normalizePath, duckdb} = config;
const page = await loaders.loadPage(path, {path, ...config});
- const resolvers = await getResolvers(page, {root, path, loaders, normalizePath});
+ const resolvers = await getResolvers(page, {root, path, loaders, normalizePath, duckdb});
if (resolvers.hash === initialHash) send({type: "welcome"});
else return void send({type: "reload"});
hash = resolvers.hash;
diff --git a/src/resolvers.ts b/src/resolvers.ts
index 3e9bd49fd..600dfb196 100644
--- a/src/resolvers.ts
+++ b/src/resolvers.ts
@@ -1,5 +1,7 @@
import {createHash} from "node:crypto";
import {extname, join} from "node:path/posix";
+import type {DuckDBConfig} from "./config.js";
+import {cacheDuckDBExtension} from "./duckdb.js";
import {findAssets} from "./html.js";
import {defaultGlobals} from "./javascript/globals.js";
import {isJavaScript} from "./javascript/imports.js";
@@ -38,6 +40,7 @@ export interface ResolversConfig {
normalizePath: (path: string) => string;
globalStylesheets?: string[];
loaders: LoaderResolver;
+ duckdb: DuckDBConfig;
}
const defaultImports = [
@@ -202,7 +205,7 @@ async function resolveResolvers(
    staticImports?: Iterable<string> | null;
    stylesheets?: Iterable<string> | null;
},
- {root, path, normalizePath, loaders}: ResolversConfig
+ {root, path, normalizePath, loaders, duckdb}: ResolversConfig
): Promise> {
const files = new Set(initialFiles);
const fileMethods = new Set(initialFileMethods);
@@ -361,12 +364,15 @@ async function resolveResolvers(
  // Add implicit downloads. (This should maybe be stored separately rather
// than being tossed into global imports, but it works for now.)
- for (const specifier of getImplicitDownloads(globalImports)) {
+ for (const specifier of getImplicitDownloads(globalImports, duckdb)) {
globalImports.add(specifier);
if (specifier.startsWith("npm:")) {
const path = await resolveNpmImport(root, specifier.slice("npm:".length));
resolutions.set(specifier, path);
await populateNpmCache(root, path);
+ } else if (specifier.startsWith("duckdb:")) {
+ const path = await cacheDuckDBExtension(root, specifier.slice("duckdb:".length));
+ resolutions.set(specifier, path);
} else if (!specifier.startsWith("observablehq:")) {
throw new Error(`unhandled implicit download: ${specifier}`);
}
diff --git a/src/rollup.ts b/src/rollup.ts
index f603c4a04..6a41c5d46 100644
--- a/src/rollup.ts
+++ b/src/rollup.ts
@@ -6,7 +6,7 @@ import type {AstNode, OutputChunk, Plugin, ResolveIdResult} from "rollup";
import {rollup} from "rollup";
import esbuild from "rollup-plugin-esbuild";
import {getClientPath, getStylePath} from "./files.js";
-import annotate from "./javascript/annotate.js";
+import {annotatePath} from "./javascript/annotate.js";
import type {StringLiteral} from "./javascript/source.js";
import {getStringLiteralValue, isStringLiteral} from "./javascript/source.js";
import {resolveNpmImport} from "./npm.js";
@@ -178,7 +178,7 @@ function importMetaResolve(path: string, resolveImport: ImportResolver): Plugin
for (const source of resolves) {
const specifier = getStringLiteralValue(source);
const resolution = await resolveImport(specifier);
- if (resolution) output.replaceLeft(source.start, source.end, annotate(relativePath(path, resolution)));
+ if (resolution) output.replaceLeft(source.start, source.end, annotatePath(relativePath(path, resolution)));
}
return {code: String(output)};
diff --git a/templates/default/observablehq.config.js.tmpl b/templates/default/observablehq.config.js.tmpl
index bfa5ef704..21c860028 100644
--- a/templates/default/observablehq.config.js.tmpl
+++ b/templates/default/observablehq.config.js.tmpl
@@ -33,5 +33,6 @@ export default {
// search: true, // activate search
// linkify: true, // convert URLs in Markdown to links
// typographer: false, // smart quotes and other typographic improvements
- // cleanUrls: true, // drop .html from URLs
+ // preserveExtension: false, // drop .html from URLs
+ // preserveIndex: false, // drop /index from URLs
};
diff --git a/templates/empty/observablehq.config.js.tmpl b/templates/empty/observablehq.config.js.tmpl
index bfa5ef704..21c860028 100644
--- a/templates/empty/observablehq.config.js.tmpl
+++ b/templates/empty/observablehq.config.js.tmpl
@@ -33,5 +33,6 @@ export default {
// search: true, // activate search
// linkify: true, // convert URLs in Markdown to links
// typographer: false, // smart quotes and other typographic improvements
- // cleanUrls: true, // drop .html from URLs
+ // preserveExtension: false, // drop .html from URLs
+ // preserveIndex: false, // drop /index from URLs
};
diff --git a/test/build-test.ts b/test/build-test.ts
index d4245d460..e1ff0a392 100644
--- a/test/build-test.ts
+++ b/test/build-test.ts
@@ -8,6 +8,7 @@ import {ascending, difference} from "d3-array";
import type {BuildManifest} from "../src/build.js";
import {FileBuildEffects, build} from "../src/build.js";
import {normalizeConfig, readConfig, setCurrentDate} from "../src/config.js";
+import {mockDuckDB} from "./mocks/duckdb.js";
import {mockJsDelivr} from "./mocks/jsdelivr.js";
import {mockJsr} from "./mocks/jsr.js";
@@ -33,6 +34,7 @@ describe("build", () => {
before(() => setCurrentDate(new Date("2024-01-10T16:00:00")));
mockJsDelivr();
mockJsr();
+ mockDuckDB();
// Each sub-directory of test/input/build is a test case.
const inputRoot = "test/input/build";
diff --git a/test/config-test.ts b/test/config-test.ts
index 2203372bb..32360f734 100644
--- a/test/config-test.ts
+++ b/test/config-test.ts
@@ -1,9 +1,29 @@
import assert from "node:assert";
import {resolve} from "node:path";
import MarkdownIt from "markdown-it";
+import type {DuckDBConfig} from "../src/config.js";
import {normalizeConfig as config, mergeToc, readConfig, setCurrentDate} from "../src/config.js";
import {LoaderResolver} from "../src/loader.js";
+const DUCKDB_DEFAULTS: DuckDBConfig = {
+ platforms: {
+ eh: true,
+ mvp: true
+ },
+ extensions: {
+ json: {
+ source: "https://extensions.duckdb.org/",
+ install: true,
+ load: false
+ },
+ parquet: {
+ source: "https://extensions.duckdb.org/",
+ install: true,
+ load: false
+ }
+ }
+};
+
describe("readConfig(undefined, root)", () => {
before(() => setCurrentDate(new Date("2024-01-10T16:00:00")));
it("imports the config file at the specified root", async () => {
@@ -43,7 +63,8 @@ describe("readConfig(undefined, root)", () => {
footer:
'Built with Observable on Jan 10, 2024.',
search: null,
- watchPath: resolve("test/input/build/config/observablehq.config.js")
+ watchPath: resolve("test/input/build/config/observablehq.config.js"),
+ duckdb: DUCKDB_DEFAULTS
});
});
it("returns the default config if no config file is found", async () => {
@@ -71,7 +92,8 @@ describe("readConfig(undefined, root)", () => {
footer:
'Built with Observable on Jan 10, 2024.',
search: null,
- watchPath: undefined
+ watchPath: undefined,
+ duckdb: DUCKDB_DEFAULTS
});
});
});
@@ -187,7 +209,7 @@ describe("normalizeConfig(spec, root)", () => {
});
});
-describe("normalizePath(path) with {cleanUrls: false}", () => {
+describe("normalizePath(path) with {cleanUrls: false} (deprecated)", () => {
const root = "test/input";
const normalize = config({cleanUrls: false}, root).normalizePath;
it("appends .html to extension-less links", () => {
@@ -234,7 +256,7 @@ describe("normalizePath(path) with {cleanUrls: false}", () => {
});
});
-describe("normalizePath(path) with {cleanUrls: true}", () => {
+describe("normalizePath(path) with {cleanUrls: true} (deprecated)", () => {
const root = "test/input";
const normalize = config({cleanUrls: true}, root).normalizePath;
it("does not append .html to extension-less links", () => {
@@ -283,6 +305,156 @@ describe("normalizePath(path) with {cleanUrls: true}", () => {
});
});
+describe("normalizePath(path) with {preserveExtension: true}", () => {
+ const root = "test/input";
+ const normalize = config({preserveExtension: true}, root).normalizePath;
+ it("appends .html to extension-less links", () => {
+ assert.strictEqual(normalize("foo"), "foo.html");
+ });
+ it("does not append .html to extensioned links", () => {
+ assert.strictEqual(normalize("foo.png"), "foo.png");
+ assert.strictEqual(normalize("foo.html"), "foo.html");
+ assert.strictEqual(normalize("foo.md"), "foo.md");
+ });
+ it("preserves absolute paths", () => {
+ assert.strictEqual(normalize("/foo"), "/foo.html");
+ assert.strictEqual(normalize("/foo.html"), "/foo.html");
+ assert.strictEqual(normalize("/foo.png"), "/foo.png");
+ });
+ it("converts index links to directories", () => {
+ assert.strictEqual(normalize("foo/index"), "foo/");
+ assert.strictEqual(normalize("foo/index.html"), "foo/");
+ assert.strictEqual(normalize("../index"), "../");
+ assert.strictEqual(normalize("../index.html"), "../");
+ assert.strictEqual(normalize("./index"), "./");
+ assert.strictEqual(normalize("./index.html"), "./");
+ assert.strictEqual(normalize("/index"), "/");
+ assert.strictEqual(normalize("/index.html"), "/");
+ assert.strictEqual(normalize("index"), ".");
+ assert.strictEqual(normalize("index.html"), ".");
+ });
+ it("preserves links to directories", () => {
+ assert.strictEqual(normalize(""), "");
+ assert.strictEqual(normalize("/"), "/");
+ assert.strictEqual(normalize("./"), "./");
+ assert.strictEqual(normalize("../"), "../");
+ assert.strictEqual(normalize("foo/"), "foo/");
+ assert.strictEqual(normalize("./foo/"), "./foo/");
+ assert.strictEqual(normalize("../foo/"), "../foo/");
+ assert.strictEqual(normalize("../sub/"), "../sub/");
+ });
+ it("preserves a relative path", () => {
+ assert.strictEqual(normalize("foo"), "foo.html");
+ assert.strictEqual(normalize("./foo"), "./foo.html");
+ assert.strictEqual(normalize("../foo"), "../foo.html");
+ assert.strictEqual(normalize("./foo.png"), "./foo.png");
+ assert.strictEqual(normalize("../foo.png"), "../foo.png");
+ });
+});
+
+describe("normalizePath(path) with {preserveExtension: false}", () => {
+ const root = "test/input";
+ const normalize = config({preserveExtension: false}, root).normalizePath;
+ it("does not append .html to extension-less links", () => {
+ assert.strictEqual(normalize("foo"), "foo");
+ });
+ it("does not append .html to extensioned links", () => {
+ assert.strictEqual(normalize("foo.png"), "foo.png");
+ assert.strictEqual(normalize("foo.md"), "foo.md");
+ });
+ it("removes .html from extensioned links", () => {
+ assert.strictEqual(normalize("foo.html"), "foo");
+ });
+ it("preserves absolute paths", () => {
+ assert.strictEqual(normalize("/foo"), "/foo");
+ assert.strictEqual(normalize("/foo.html"), "/foo");
+ assert.strictEqual(normalize("/foo.png"), "/foo.png");
+ });
+ it("converts index links to directories", () => {
+ assert.strictEqual(normalize("foo/index"), "foo/");
+ assert.strictEqual(normalize("foo/index.html"), "foo/");
+ assert.strictEqual(normalize("../index"), "../");
+ assert.strictEqual(normalize("../index.html"), "../");
+ assert.strictEqual(normalize("./index"), "./");
+ assert.strictEqual(normalize("./index.html"), "./");
+ assert.strictEqual(normalize("/index"), "/");
+ assert.strictEqual(normalize("/index.html"), "/");
+ assert.strictEqual(normalize("index"), ".");
+ assert.strictEqual(normalize("index.html"), ".");
+ });
+ it("preserves links to directories", () => {
+ assert.strictEqual(normalize(""), "");
+ assert.strictEqual(normalize("/"), "/");
+ assert.strictEqual(normalize("./"), "./");
+ assert.strictEqual(normalize("../"), "../");
+ assert.strictEqual(normalize("foo/"), "foo/");
+ assert.strictEqual(normalize("./foo/"), "./foo/");
+ assert.strictEqual(normalize("../foo/"), "../foo/");
+ assert.strictEqual(normalize("../sub/"), "../sub/");
+ });
+ it("preserves a relative path", () => {
+ assert.strictEqual(normalize("foo"), "foo");
+ assert.strictEqual(normalize("./foo"), "./foo");
+ assert.strictEqual(normalize("../foo"), "../foo");
+ assert.strictEqual(normalize("./foo.png"), "./foo.png");
+ assert.strictEqual(normalize("../foo.png"), "../foo.png");
+ });
+});
+
+describe("normalizePath(path) with {preserveIndex: true}", () => {
+ const root = "test/input";
+ const normalize = config({preserveIndex: true}, root).normalizePath;
+ it("preserves index links", () => {
+ assert.strictEqual(normalize("foo/index"), "foo/index");
+ assert.strictEqual(normalize("foo/index.html"), "foo/index");
+ assert.strictEqual(normalize("../index"), "../index");
+ assert.strictEqual(normalize("../index.html"), "../index");
+ assert.strictEqual(normalize("./index"), "./index");
+ assert.strictEqual(normalize("./index.html"), "./index");
+ assert.strictEqual(normalize("/index"), "/index");
+ assert.strictEqual(normalize("/index.html"), "/index");
+ assert.strictEqual(normalize("index"), "index");
+ assert.strictEqual(normalize("index.html"), "index");
+ });
+ it("converts links to directories", () => {
+ assert.strictEqual(normalize(""), "");
+ assert.strictEqual(normalize("/"), "/index");
+ assert.strictEqual(normalize("./"), "./index");
+ assert.strictEqual(normalize("../"), "../index");
+ assert.strictEqual(normalize("foo/"), "foo/index");
+ assert.strictEqual(normalize("./foo/"), "./foo/index");
+ assert.strictEqual(normalize("../foo/"), "../foo/index");
+ assert.strictEqual(normalize("../sub/"), "../sub/index");
+ });
+});
+
+describe("normalizePath(path) with {preserveIndex: true, preserveExtension: true}", () => {
+ const root = "test/input";
+ const normalize = config({preserveIndex: true, preserveExtension: true}, root).normalizePath;
+ it("preserves index links", () => {
+ assert.strictEqual(normalize("foo/index"), "foo/index.html");
+ assert.strictEqual(normalize("foo/index.html"), "foo/index.html");
+ assert.strictEqual(normalize("../index"), "../index.html");
+ assert.strictEqual(normalize("../index.html"), "../index.html");
+ assert.strictEqual(normalize("./index"), "./index.html");
+ assert.strictEqual(normalize("./index.html"), "./index.html");
+ assert.strictEqual(normalize("/index"), "/index.html");
+ assert.strictEqual(normalize("/index.html"), "/index.html");
+ assert.strictEqual(normalize("index"), "index.html");
+ assert.strictEqual(normalize("index.html"), "index.html");
+ });
+ it("converts links to directories", () => {
+ assert.strictEqual(normalize(""), "");
+ assert.strictEqual(normalize("/"), "/index.html");
+ assert.strictEqual(normalize("./"), "./index.html");
+ assert.strictEqual(normalize("../"), "../index.html");
+ assert.strictEqual(normalize("foo/"), "foo/index.html");
+ assert.strictEqual(normalize("./foo/"), "./foo/index.html");
+ assert.strictEqual(normalize("../foo/"), "../foo/index.html");
+ assert.strictEqual(normalize("../sub/"), "../sub/index.html");
+ });
+});
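The four `normalizePath` suites above pin down how **preserveIndex** and **preserveExtension** interact. A hypothetical re-implementation of just those rules (the real logic lives in `normalizeConfig`’s `normalizePath` and also handles queries, anchors, and the deprecated **cleanUrls** option):

```typescript
// Illustrative sketch only; not the real normalizePath.
function normalizeLink(path: string, {preserveIndex = false, preserveExtension = false} = {}): string {
  if (path.endsWith(".html")) path = path.slice(0, -".html".length);
  if (preserveIndex) {
    if (path.endsWith("/")) path += "index"; // "foo/" → "foo/index"
  } else {
    if (path === "index") return ".";
    if (path.endsWith("/index")) path = path.slice(0, -"index".length); // "foo/index" → "foo/"
  }
  if (preserveExtension && path && path !== "." && !path.endsWith("/") && !/\.\w+$/.test(path)) path += ".html";
  return path;
}

console.log(normalizeLink("foo/index.html")); // foo/
console.log(normalizeLink("foo", {preserveExtension: true})); // foo.html
console.log(normalizeLink("/", {preserveIndex: true})); // /index
```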
+
describe("mergeToc(spec, toc)", () => {
const root = "test/input/build/config";
it("merges page- and project-level toc config", async () => {
@@ -295,3 +467,170 @@ describe("mergeToc(spec, toc)", () => {
assert.deepStrictEqual(mergeToc({}, toc), {label: "Contents", show: true});
});
});
+
+describe("normalizeConfig(duckdb)", () => {
+ const root = "";
+ it("uses the defaults", () => {
+ const {duckdb} = config({}, root);
+ assert.deepEqual(duckdb, DUCKDB_DEFAULTS);
+ });
+ it("supports install: false and load: false", () => {
+ const {duckdb} = config({duckdb: {extensions: {json: {install: false, load: false}}}}, root);
+ assert.deepEqual(duckdb.extensions, {
+ ...DUCKDB_DEFAULTS.extensions,
+ json: {
+ source: "https://extensions.duckdb.org/",
+ install: false,
+ load: false
+ }
+ });
+ });
+ it("supports null", () => {
+ const {duckdb} = config({duckdb: {extensions: {json: null}}}, root);
+ assert.deepEqual(
+ duckdb.extensions,
+ Object.fromEntries(Object.entries(DUCKDB_DEFAULTS.extensions).filter(([name]) => name !== "json"))
+ );
+ });
+ it("defaults load: false for known auto-loading extensions", () => {
+ const {duckdb} = config({duckdb: {extensions: {aws: {}}}}, root);
+ assert.deepEqual(duckdb.extensions, {
+ ...DUCKDB_DEFAULTS.extensions,
+ aws: {
+ source: "https://extensions.duckdb.org/",
+ install: true,
+ load: false
+ }
+ });
+ });
+ it("defaults source: core for known core extensions", () => {
+ const {duckdb} = config({duckdb: {extensions: {mysql: {}}}}, root);
+ assert.deepEqual(duckdb.extensions, {
+ ...DUCKDB_DEFAULTS.extensions,
+ mysql: {
+ source: "https://extensions.duckdb.org/",
+ install: true,
+ load: true
+ }
+ });
+ });
+ it("defaults source: community for unknown extensions", () => {
+ const {duckdb} = config({duckdb: {extensions: {h3: {}}}}, root);
+ assert.deepEqual(duckdb.extensions, {
+ ...DUCKDB_DEFAULTS.extensions,
+ h3: {
+ source: "https://community-extensions.duckdb.org/",
+ install: true,
+ load: true
+ }
+ });
+ });
+ it("supports core, community and https:// sources", () => {
+ const {duckdb} = config(
+ {
+ duckdb: {
+ extensions: {
+ foo: {source: "core"},
+ bar: {source: "community"},
+ baz: {source: "https://custom-domain"}
+ }
+ }
+ },
+ root
+ );
+ assert.deepEqual(duckdb.extensions, {
+ ...DUCKDB_DEFAULTS.extensions,
+ foo: {
+ source: "https://extensions.duckdb.org/",
+ install: true,
+ load: true
+ },
+ bar: {
+ source: "https://community-extensions.duckdb.org/",
+ install: true,
+ load: true
+ },
+ baz: {
+ source: "https://custom-domain/", // URL normalization
+ install: true,
+ load: true
+ }
+ });
+ });
+ it("supports source: string shorthand", () => {
+ const {duckdb} = config(
+ {
+ duckdb: {
+ extensions: {
+ foo: "core",
+ bar: "community",
+ baz: "https://custom-domain"
+ }
+ }
+ },
+ root
+ );
+ assert.deepEqual(duckdb.extensions, {
+ ...DUCKDB_DEFAULTS.extensions,
+ foo: {
+ source: "https://extensions.duckdb.org/",
+ install: true,
+ load: true
+ },
+ bar: {
+ source: "https://community-extensions.duckdb.org/",
+ install: true,
+ load: true
+ },
+ baz: {
+ source: "https://custom-domain/", // URL normalization
+ install: true,
+ load: true
+ }
+ });
+ });
+ it("supports load: boolean shorthand", () => {
+ const {duckdb} = config({duckdb: {extensions: {json: true, foo: true, bar: false}}}, root);
+ assert.deepEqual(duckdb.extensions, {
+ ...DUCKDB_DEFAULTS.extensions,
+ json: {
+ source: "https://extensions.duckdb.org/",
+ install: true,
+ load: true
+ },
+ foo: {
+ source: "https://community-extensions.duckdb.org/",
+ install: true,
+ load: true
+ },
+ bar: {
+ source: "https://community-extensions.duckdb.org/",
+ install: true,
+ load: false
+ }
+ });
+ });
+ it("supports sources shorthand", () => {
+ const {duckdb} = config({duckdb: {extensions: ["spatial", "h3"]}}, root);
+ assert.deepEqual(duckdb.extensions, {
+ ...DUCKDB_DEFAULTS.extensions,
+ spatial: {
+ source: "https://extensions.duckdb.org/",
+ install: true,
+ load: true
+ },
+ h3: {
+ source: "https://community-extensions.duckdb.org/",
+ install: true,
+ load: true
+ }
+ });
+ });
+ it("rejects invalid names", () => {
+ assert.throws(() => config({duckdb: {extensions: {"*^/": true}}}, root), /invalid extension/i);
+ });
+ it("rejects invalid sources", () => {
+ assert.throws(() => config({duckdb: {extensions: {foo: "file:///path/to/extension"}}}, root), /invalid source/i);
+ assert.throws(() => config({duckdb: {extensions: {foo: "notasource"}}}, root), /invalid url/i);
+ });
+});
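The `normalizeConfig(duckdb)` cases above exercise the extension-shorthand rules described in the docs. A hypothetical sketch of that normalization (the names `CORE_EXTENSIONS`, `AUTOLOADABLE`, and `normalizeExtension` are illustrative, not the real exports of `src/config.ts`, and the core set here is a small subset):

```typescript
// Illustrative subsets; the real known-extension table lives in src/duckdb.ts.
const CORE_EXTENSIONS = new Set(["aws", "json", "mysql", "parquet", "spatial"]);
const AUTOLOADABLE = new Set(["aws", "json", "parquet"]); // DuckDB can autoload these

interface Extension {source: string; install: boolean; load: boolean}

function normalizeExtension(
  name: string,
  spec: boolean | string | {source?: string; install?: boolean; load?: boolean}
): Extension {
  // Boolean shorthand sets load; string shorthand sets source.
  const opts: {source?: string; install?: boolean; load?: boolean} =
    typeof spec === "boolean" ? {load: spec} : typeof spec === "string" ? {source: spec} : spec;
  const source = opts.source ?? (CORE_EXTENSIONS.has(name) ? "core" : "community");
  const href =
    source === "core"
      ? "https://extensions.duckdb.org/"
      : source === "community"
      ? "https://community-extensions.duckdb.org/"
      : String(new URL(source)); // custom repository; throws on an invalid URL
  return {source: href, install: opts.install ?? true, load: opts.load ?? !AUTOLOADABLE.has(name)};
}

console.log(normalizeExtension("h3", true)); // community source, install and load
```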
diff --git a/test/input/build/duckdb/index.md b/test/input/build/duckdb/index.md
new file mode 100644
index 000000000..cef205280
--- /dev/null
+++ b/test/input/build/duckdb/index.md
@@ -0,0 +1,5 @@
+# test DuckDB
+
+```sql
+SELECT 1;
+```
diff --git a/test/javascript/annotate.ts b/test/javascript/annotate.ts
index 17edaa25b..8c527b37d 100644
--- a/test/javascript/annotate.ts
+++ b/test/javascript/annotate.ts
@@ -1,7 +1,5 @@
-/**
- * This file is not suffixed with '-test'; it expects to run with an extra
- * OBSERVABLE_ANNOTATE_FILES=true environment variable.
- */
+// This file is not suffixed with '-test'; it expects to run with an extra
+// OBSERVABLE_ANNOTATE_FILES=true environment variable.
import assert from "node:assert";
import type {TranspileModuleOptions} from "../../src/javascript/transpile.js";
import {transpileModule} from "../../src/javascript/transpile.js";
diff --git a/test/libraries-test.ts b/test/libraries-test.ts
index 8ecbeeee5..12cba2e46 100644
--- a/test/libraries-test.ts
+++ b/test/libraries-test.ts
@@ -53,7 +53,7 @@ describe("getImplicitStylesheets(imports)", () => {
describe("getImplicitDownloads(imports)", () => {
it("supports known imports", () => {
assert.deepStrictEqual(
- getImplicitDownloads(["npm:@observablehq/duckdb"]),
+ getImplicitDownloads(["npm:@observablehq/duckdb"], {extensions: {}, platforms: {mvp: true, eh: true}}),
new Set([
"npm:@duckdb/duckdb-wasm/dist/duckdb-mvp.wasm",
"npm:@duckdb/duckdb-wasm/dist/duckdb-browser-mvp.worker.js",
diff --git a/test/mocks/duckdb.ts b/test/mocks/duckdb.ts
new file mode 100644
index 000000000..525400fd5
--- /dev/null
+++ b/test/mocks/duckdb.ts
@@ -0,0 +1,17 @@
+import {getCurrentAgent, mockAgent} from "./undici.js";
+
+export function mockDuckDB() {
+ mockAgent();
+ before(async () => {
+ const agent = getCurrentAgent();
+ const client = agent.get("https://extensions.duckdb.org");
+ for (const p of ["mvp", "eh"]) {
+ for (const name of ["json", "parquet"]) {
+ client
+ .intercept({path: `/v1.1.1/wasm_${p}/${name}.duckdb_extension.wasm`, method: "GET"})
+ .reply(200, "", {headers: {"content-type": "application/wasm"}})
+ .persist();
+ }
+ }
+ });
+}
diff --git a/test/mocks/jsdelivr.ts b/test/mocks/jsdelivr.ts
index 70119be4b..597823d74 100644
--- a/test/mocks/jsdelivr.ts
+++ b/test/mocks/jsdelivr.ts
@@ -1,7 +1,7 @@
import {getCurrentAgent, mockAgent} from "./undici.js";
const packages: [name: string, {version: string; contents?: string; dependencies?: Record<string, string>}][] = [
- ["@duckdb/duckdb-wasm", {version: "1.28.0"}],
+ ["@duckdb/duckdb-wasm", {version: "1.29.0"}],
["@example/url-import", {version: "1.0.0", contents: "import('https://example.com');"}],
["@observablehq/inputs", {version: "0.10.6"}],
["@observablehq/plot", {version: "0.6.11"}],
diff --git a/test/output/build/npm/_observablehq/stdlib/inputs.00000005.js b/test/output/build/duckdb/_duckdb/json-e3b0c442/v1.1.1/wasm_eh/json.duckdb_extension.wasm
similarity index 100%
rename from test/output/build/npm/_observablehq/stdlib/inputs.00000005.js
rename to test/output/build/duckdb/_duckdb/json-e3b0c442/v1.1.1/wasm_eh/json.duckdb_extension.wasm
diff --git a/test/output/build/npm/_observablehq/stdlib/inputs.00000006.css b/test/output/build/duckdb/_duckdb/json-e3b0c442/v1.1.1/wasm_mvp/json.duckdb_extension.wasm
similarity index 100%
rename from test/output/build/npm/_observablehq/stdlib/inputs.00000006.css
rename to test/output/build/duckdb/_duckdb/json-e3b0c442/v1.1.1/wasm_mvp/json.duckdb_extension.wasm
diff --git a/test/output/build/duckdb/_duckdb/parquet-e3b0c442/v1.1.1/wasm_eh/parquet.duckdb_extension.wasm b/test/output/build/duckdb/_duckdb/parquet-e3b0c442/v1.1.1/wasm_eh/parquet.duckdb_extension.wasm
new file mode 100644
index 000000000..e69de29bb
diff --git a/test/output/build/duckdb/_duckdb/parquet-e3b0c442/v1.1.1/wasm_mvp/parquet.duckdb_extension.wasm b/test/output/build/duckdb/_duckdb/parquet-e3b0c442/v1.1.1/wasm_mvp/parquet.duckdb_extension.wasm
new file mode 100644
index 000000000..e69de29bb
diff --git a/test/output/build/duckdb/_npm/@duckdb/duckdb-wasm@1.29.0/cd372fb8.js b/test/output/build/duckdb/_npm/@duckdb/duckdb-wasm@1.29.0/cd372fb8.js
new file mode 100644
index 000000000..e69de29bb
diff --git a/test/output/build/duckdb/_npm/@duckdb/duckdb-wasm@1.29.0/dist/duckdb-browser-eh.worker.cd372fb8.js b/test/output/build/duckdb/_npm/@duckdb/duckdb-wasm@1.29.0/dist/duckdb-browser-eh.worker.cd372fb8.js
new file mode 100644
index 000000000..e69de29bb
diff --git a/test/output/build/duckdb/_npm/@duckdb/duckdb-wasm@1.29.0/dist/duckdb-browser-mvp.worker.cd372fb8.js b/test/output/build/duckdb/_npm/@duckdb/duckdb-wasm@1.29.0/dist/duckdb-browser-mvp.worker.cd372fb8.js
new file mode 100644
index 000000000..e69de29bb
diff --git a/test/output/build/duckdb/_npm/@duckdb/duckdb-wasm@1.29.0/dist/duckdb-eh.wasm b/test/output/build/duckdb/_npm/@duckdb/duckdb-wasm@1.29.0/dist/duckdb-eh.wasm
new file mode 100644
index 000000000..e69de29bb
diff --git a/test/output/build/duckdb/_npm/@duckdb/duckdb-wasm@1.29.0/dist/duckdb-mvp.wasm b/test/output/build/duckdb/_npm/@duckdb/duckdb-wasm@1.29.0/dist/duckdb-mvp.wasm
new file mode 100644
index 000000000..e69de29bb
diff --git a/test/output/build/duckdb/_npm/htl@0.3.1/cd372fb8.js b/test/output/build/duckdb/_npm/htl@0.3.1/cd372fb8.js
new file mode 100644
index 000000000..e69de29bb
diff --git a/test/output/build/duckdb/_npm/isoformat@0.2.1/cd372fb8.js b/test/output/build/duckdb/_npm/isoformat@0.2.1/cd372fb8.js
new file mode 100644
index 000000000..e69de29bb
diff --git a/test/output/build/duckdb/_observablehq/client.00000001.js b/test/output/build/duckdb/_observablehq/client.00000001.js
new file mode 100644
index 000000000..e69de29bb
diff --git a/test/output/build/duckdb/_observablehq/runtime.00000002.js b/test/output/build/duckdb/_observablehq/runtime.00000002.js
new file mode 100644
index 000000000..e69de29bb
diff --git a/test/output/build/duckdb/_observablehq/stdlib.00000003.js b/test/output/build/duckdb/_observablehq/stdlib.00000003.js
new file mode 100644
index 000000000..e69de29bb
diff --git a/test/output/build/duckdb/_observablehq/stdlib/duckdb.00000005.js b/test/output/build/duckdb/_observablehq/stdlib/duckdb.00000005.js
new file mode 100644
index 000000000..e69de29bb
diff --git a/test/output/build/duckdb/_observablehq/stdlib/inputs.00000006.css b/test/output/build/duckdb/_observablehq/stdlib/inputs.00000006.css
new file mode 100644
index 000000000..e69de29bb
diff --git a/test/output/build/duckdb/_observablehq/stdlib/inputs.00000007.js b/test/output/build/duckdb/_observablehq/stdlib/inputs.00000007.js
new file mode 100644
index 000000000..e69de29bb
diff --git a/test/output/build/duckdb/_observablehq/theme-air,near-midnight.00000004.css b/test/output/build/duckdb/_observablehq/theme-air,near-midnight.00000004.css
new file mode 100644
index 000000000..e69de29bb
diff --git a/test/output/build/duckdb/index.html b/test/output/build/duckdb/index.html
new file mode 100644
index 000000000..9028d12f2
--- /dev/null
+++ b/test/output/build/duckdb/index.html
@@ -0,0 +1,48 @@
+… (48-line built HTML snapshot for the “test DuckDB” page, generated from test/input/build/duckdb/index.md; the markup was stripped during extraction and is not recoverable here) …