
Experiment: use JSDoc types in JS runtime #1218

Draft
jiribenes wants to merge 7 commits into main from experiment/jsdoc-types

Conversation

@jiribenes
Contributor

Most of the contents are scaffolding; what this really adds is:

  1. types in `libraries/js/effekt_*.js` using the JSDoc syntax
  2. a `// @ts-check` comment at the top of each file

This means that we get some light form of type checking when investigating the JS output in VSCode.
It's especially useful when writing FFI. :)
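To illustrate the style (this is a hypothetical sketch, not code from `effekt_*.js`; the function name and signature are made up), a JSDoc-typed helper under `// @ts-check` looks like this:

```javascript
// @ts-check
// Hypothetical example of JSDoc-typed runtime code in CPS style.
// `succ` and its continuation parameter `k` are illustrative names.

/**
 * @param {number} x
 * @param {(result: number) => void} k  the continuation to call with the result
 * @returns {void}
 */
function succ(x, k) {
  k(x + 1);
}
```

With `// @ts-check` enabled, VSCode's built-in TypeScript service checks calls like `succ("oops", k)` against these annotations without any build step.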

I have not tested this properly; I have really only tried it on two examples.

To me, this is just a first step: the type annotations currently present are a combination of my own vibes and of Claude's ideas of what the types should be. There are also a lot of `@ts-ignore` comments, casts to `any`, and similar workarounds to silence TS; this could likely be improved much further, hence the draft status of this PR.

@jiribenes
Contributor Author

There are still a few spurious errors like the following, but it seems to work fine now:
(screenshot: spurious type errors)

@jiribenes
Contributor Author

But the types propagate reasonably well when there are no local functions in the way; this is from parsing_dollars:
(screenshot: inferred types in parsing_dollars output)
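A minimal sketch of why unannotated local functions stop type propagation (all names here are illustrative, not from the compiled output):

```javascript
// @ts-check
/**
 * @param {number} n
 * @returns {number}
 */
function double(n) {
  return n * 2;
}

const x = double(21); // `x` is inferred as `number`

// An unannotated local function in between loses the type:
// `v` is implicitly `any`, so checking stops propagating here.
function through(v) {
  return double(v);
}
```

Once a value passes through `through`, the checker can no longer flag misuses downstream, which matches the behaviour described above.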

jiribenes added a commit that referenced this pull request Mar 23, 2026
Previously, we:
1. for each `var`, we allocated a new closure for its own `set` method
2. for each `RESET`, we allocated two closures for its `fresh` and
`arena` methods

... what if we, uh, didn't do that?

---

This could have been resolved with either prototypes or classes, but I
chose classes since intent is important and classes are more easily
recognised by tooling (and by JSDoc, which is relevant for #1218).
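The allocation pattern in question can be sketched as follows (a hedged illustration with hypothetical names, not the actual Effekt runtime code):

```javascript
// Before: each mutable reference allocates fresh closures for its
// own `get`/`set` methods, one pair per instance.
function makeRefClosure(init) {
  let value = init;
  return {
    get: () => value,          // new closure per ref
    set: (v) => { value = v; } // new closure per ref
  };
}

// After: a class, so `get`/`set` live once on the prototype and are
// shared by all instances; tooling (incl. JSDoc) recognises the shape.
class Ref {
  /** @param {*} init */
  constructor(init) {
    this.value = init;
  }
  get() { return this.value; }
  set(v) { this.value = v; }
}
```

The class version trades per-instance closure allocation for a single prototype lookup, which is what the speedups below measure.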

The benchmark results are pretty strong: the geomean is roughly a **4-5%
speedup** on my machine, and _all benchmarks_ got faster!

These are the biggest changes (run with hyperfine, warmup = 5, N ≥ 10,
inputs from the `config_js.txt` file); everything else is negligible
per the Welch t-test (using mean, stddev & number of runs; the tests
where hyperfine complained about outliers are deemed not significant):

Benchmark | main after #1330 | this PR | Δ
-- | -- | -- | --
mandelbrot | 177 ms | 108 ms | −38.8%
sieve | 150 ms | 134 ms | −10.6%
bounce | 120 ms | 114 ms | −5.2%
storage | 690 ms | 672 ms | −2.5%
product_early | 248 ms | 241 ms | −2.7%
triples | 375 ms | 366 ms | −2.4%
dyck_one | 404 ms | 385 ms | −4.8%
number_matrix | 357 ms | 344 ms | −3.7%
financial_format | 294 ms | 284 ms | −3.4%
