Compare commits

...

36 Commits

Author SHA1 Message Date
e5515a4c05 Do we really need TokenType to go into TS 2023-09-26 11:07:54 -05:00
0bc685b0c4 Bump tungstenite from 0.20.0 to 0.20.1 in /src/wasm-lib/kcl/fuzz (#709)
Bumps [tungstenite](https://github.com/snapview/tungstenite-rs) from 0.20.0 to 0.20.1.
- [Changelog](https://github.com/snapview/tungstenite-rs/blob/master/CHANGELOG.md)
- [Commits](https://github.com/snapview/tungstenite-rs/compare/v0.20.0...v0.20.1)

---
updated-dependencies:
- dependency-name: tungstenite
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-25 20:49:16 -07:00
9ee032771a unused dep (#710) 2023-09-26 03:22:05 +00:00
c307ddd1b1 resize (#706)
* start of resize

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* refactor

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* check if 0

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* updates

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* updates

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* will work w new lib

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* new types

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* handle resize effect

---------

Signed-off-by: Jess Frazelle <github@jessfraz.com>
Co-authored-by: Kurt Hutten Irev-Dev <k.hutten@protonmail.ch>
2023-09-25 19:49:53 -07:00
a30818ff2b fixes negative args in function (#707)
* updates

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* fixes

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* updates

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* fixes

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* fixes

Signed-off-by: Jess Frazelle <github@jessfraz.com>

---------

Signed-off-by: Jess Frazelle <github@jessfraz.com>
2023-09-25 15:25:58 -07:00
53e763d938 fix close arc (#704)
* fix close arc

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* much bigger radius

Signed-off-by: Jess Frazelle <github@jessfraz.com>

---------

Signed-off-by: Jess Frazelle <github@jessfraz.com>
2023-09-25 12:14:41 -07:00
8f74cd1d0c Bump tauri-plugin-fs-extra from 0190f68 to b04bde3 in /src-tauri (#702)
Bumps [tauri-plugin-fs-extra](https://github.com/tauri-apps/plugins-workspace) from `0190f68` to `b04bde3`.
- [Release notes](https://github.com/tauri-apps/plugins-workspace/releases)
- [Commits](0190f68f1d...b04bde3461)

---
updated-dependencies:
- dependency-name: tauri-plugin-fs-extra
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-25 09:33:53 -07:00
c271942897 remove errors (#703) 2023-09-25 07:28:03 +00:00
a03d09b41d Restructure tokenizer module (#700)
* Remove duplicated tests

These tests already were copied to tokeniser2.rs, so removing them doesn't affect code coverage.

* Move tokeniser to its own module

Now there's a module for tokens, and the tokenizer/lexer implementation is private within the token module.
2023-09-24 20:01:17 -05:00
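The restructuring described in #700 above is the usual Rust pattern of a public module that exposes the token types while keeping the tokenizer implementation in a private submodule. Below is a minimal, hypothetical sketch of that layout; none of these names are the actual kcl items, and the lexing logic is a stub only meant to show the module boundary.

```rust
// Hypothetical layout, not the crate's actual code: token types are public,
// the tokenizer implementation stays private inside the `token` module.
pub mod token {
    // Private submodule: only the `token` module can call into it.
    mod tokeniser {
        use super::{Token, TokenType};

        pub(super) fn lex(input: &str) -> Vec<Token> {
            // A real implementation would walk `input`; this stub shows the boundary.
            input
                .split_whitespace()
                .map(|word| Token {
                    token_type: TokenType::Word,
                    value: word.to_string(),
                })
                .collect()
        }
    }

    #[derive(Debug, Clone, PartialEq)]
    pub struct Token {
        pub token_type: TokenType,
        pub value: String,
    }

    #[derive(Debug, Clone, Copy, PartialEq, Eq)]
    pub enum TokenType {
        Word,
    }

    // The only public entry point; callers never see `tokeniser` directly.
    pub fn lexer(input: &str) -> Vec<Token> {
        tokeniser::lex(input)
    }
}

fn main() {
    println!("{:?}", token::lexer("let x = 1"));
}
```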
2971b7752b Bump rust websocket libraries (#701)
Changelog: https://github.com/snapview/tokio-tungstenite/blob/master/CHANGELOG.md#0201
2023-09-24 21:34:31 +00:00
70e99eb00b Refactor is_code_token into a method (#699)
* Refactor is_code_token into a method

* Fix typos, use Parser as it was imported
2023-09-24 21:11:36 +00:00
5c66af59d2 New tokenizer based on winnow (#697)
* New tokenizer, using Winnow instead of regexes.

Between 1.3x and 4.4x speedup on lexer benchmarks :)

* Use dispatch instead of alt

Most of the time, if you know the first character of a token, you can narrow down its potential possible token types, instead of just trying each token type until one succeeds.

This further speeds up the lexer. Compared to main, this branch is now between 3x and 12x faster than main.
2023-09-22 21:57:39 -05:00
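For #697 above, the speedup from `dispatch` comes from branching on the first character of the input rather than trying each token rule in order the way `alt` does. The sketch below illustrates that idea in plain Rust rather than winnow's combinators, with made-up token rules; it is not the kcl lexer itself.

```rust
// Sketch of dispatch-on-first-character: the `match` picks the single token
// rule that can possibly succeed, instead of attempting every alternative.
#[derive(Debug, PartialEq)]
enum Token {
    Number(String),
    Word(String),
    Whitespace(String),
}

fn lex_one(input: &str) -> Option<(Token, &str)> {
    let first = input.chars().next()?;
    match first {
        '0'..='9' => Some(take_while(input, |c| c.is_ascii_digit(), Token::Number)),
        'a'..='z' | 'A'..='Z' | '_' => Some(take_while(
            input,
            |c| c.is_ascii_alphanumeric() || c == '_',
            Token::Word,
        )),
        c if c.is_whitespace() => Some(take_while(input, char::is_whitespace, Token::Whitespace)),
        _ => None,
    }
}

// Consume the longest prefix matching `pred` and wrap it in a token.
fn take_while(
    input: &str,
    pred: impl Fn(char) -> bool,
    make: impl Fn(String) -> Token,
) -> (Token, &str) {
    let end = input.find(|c| !pred(c)).unwrap_or(input.len());
    (make(input[..end].to_string()), &input[end..])
}

fn main() {
    let mut rest = "foo 42";
    while let Some((tok, r)) = lex_one(rest) {
        println!("{tok:?}");
        rest = r;
    }
}
```

With `alt`-style matching, every rule that fails still costs a partial attempt; dispatching on the leading character makes the choice a single branch.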
6dda6daeef Use separate benchmarks for lexing and parsing (#698) 2023-09-23 02:01:18 +00:00
b5387f1220 Cut release v0.9.1 (#693)
* bump to v0.9.1

* update bump instructions

* readme update

* read me again

* change pr convention
2023-09-22 10:38:17 +10:00
fd5921b366 Convert the lexer to be iterative not recursive (#691)
This is often more memory-efficient (does not create a bunch of stack
frames)
2023-09-21 19:19:08 -05:00
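The point of the iterative rewrite in #691 above is bounded stack usage: a recursive lexer pushes one call frame per token, which is exactly what blows up on large files. A toy illustration of the two shapes follows; it is not the kcl code, and the "tokens" are just bytes to keep the example short.

```rust
// Illustrative only: the same scan written recursively and iteratively.
// The recursive form grows the call stack by one frame per consumed item.
fn lex_recursive(input: &[u8], mut tokens: Vec<u8>) -> Vec<u8> {
    match input.split_first() {
        None => tokens,
        Some((byte, rest)) => {
            tokens.push(*byte);
            lex_recursive(rest, tokens) // one stack frame per "token"
        }
    }
}

fn lex_iterative(input: &[u8]) -> Vec<u8> {
    let mut tokens = Vec::with_capacity(input.len());
    for byte in input {
        tokens.push(*byte); // constant stack usage regardless of input length
    }
    tokens
}

fn main() {
    let src = b"startSketchAt([0, 0])";
    assert_eq!(lex_recursive(src, Vec::new()), lex_iterative(src));
}
```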
716ad938fc stop gap for large files making editor slow (#690)
Signed-off-by: Jess Frazelle <github@jessfraz.com>
2023-09-21 16:13:22 -07:00
40136eb392 Bump kittycad from 0.2.25 to 0.2.26 in /src-tauri (#680)
Bumps [kittycad](https://github.com/KittyCAD/kittycad.rs) from 0.2.25 to 0.2.26.
- [Release notes](https://github.com/KittyCAD/kittycad.rs/releases)
- [Commits](https://github.com/KittyCAD/kittycad.rs/compare/v0.2.25...v0.2.26)

---
updated-dependencies:
- dependency-name: kittycad
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-21 14:05:55 -07:00
8d2b89fcd1 Bump openapitor from 0d121f6 to 61a1605 in /src/wasm-lib (#679)
Bumps [openapitor](https://github.com/KittyCAD/kittycad.rs) from `0d121f6` to `61a1605`.
- [Release notes](https://github.com/KittyCAD/kittycad.rs/releases)
- [Commits](0d121f6881...61a16059b3)

---
updated-dependencies:
- dependency-name: openapitor
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-21 15:22:04 -05:00
ad9fba3390 Bump tauri-plugin-fs-extra from 76832e6 to 0190f68 in /src-tauri (#681)
Bumps [tauri-plugin-fs-extra](https://github.com/tauri-apps/plugins-workspace) from `76832e6` to `0190f68`.
- [Release notes](https://github.com/tauri-apps/plugins-workspace/releases)
- [Commits](76832e60bf...0190f68f1d)

---
updated-dependencies:
- dependency-name: tauri-plugin-fs-extra
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-21 15:21:08 -05:00
911c43af50 Bump phonenumber from 0.3.2+8.13.9 to 0.3.3+8.13.9 in /src-tauri (#687)
Bumps [phonenumber](https://github.com/whisperfish/rust-phonenumber) from 0.3.2+8.13.9 to 0.3.3+8.13.9.
- [Commits](https://github.com/whisperfish/rust-phonenumber/commits)

---
updated-dependencies:
- dependency-name: phonenumber
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-21 15:02:37 -05:00
ab4e04f6c2 Bump phonenumber from 0.3.2+8.13.9 to 0.3.3+8.13.9 in /src/wasm-lib (#685)
Bumps [phonenumber](https://github.com/whisperfish/rust-phonenumber) from 0.3.2+8.13.9 to 0.3.3+8.13.9.
- [Commits](https://github.com/whisperfish/rust-phonenumber/commits)

---
updated-dependencies:
- dependency-name: phonenumber
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-21 15:02:23 -05:00
94aef05f74 Bump phonenumber from 0.3.2+8.13.9 to 0.3.3+8.13.9 in /src/wasm-lib/kcl/fuzz (#686)
Bump phonenumber in /src/wasm-lib/kcl/fuzz

Bumps [phonenumber](https://github.com/whisperfish/rust-phonenumber) from 0.3.2+8.13.9 to 0.3.3+8.13.9.
- [Commits](https://github.com/whisperfish/rust-phonenumber/commits)

---
updated-dependencies:
- dependency-name: phonenumber
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-21 15:02:10 -05:00
d820cf2446 Tokenizer is accidentally quadratic (#689)
* Add comments and rename a function

* Typo: paran -> paren

* Use bytes, not string, for the tokenizer

* Fix typo
2023-09-21 14:18:42 -05:00
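The "accidentally quadratic" fix in #689 above is about how the remaining input is tracked while lexing. Below is a hedged sketch of the before/after pattern with placeholder lexing logic, not the real tokenizer: re-copying the tail of the source on every token is O(n) per token and O(n²) overall, while advancing a byte offset into the original buffer is linear.

```rust
// Quadratic shape: the remaining source is copied into a new String per token.
fn lex_quadratic(source: &str) -> usize {
    let mut remaining = source.to_string();
    let mut count = 0;
    while !remaining.is_empty() {
        // "Consume one token": here, a single char; the copy below is the problem.
        let consumed = remaining.chars().next().map(char::len_utf8).unwrap_or(0);
        remaining = remaining[consumed..].to_string(); // O(remaining.len()) each time
        count += 1;
    }
    count
}

// Linear shape: keep a byte offset into the original &str / &[u8]; nothing is copied.
fn lex_linear(source: &str) -> usize {
    let bytes = source.as_bytes();
    let mut pos = 0;
    let mut count = 0;
    while pos < bytes.len() {
        let char_len = source[pos..].chars().next().map(char::len_utf8).unwrap_or(1);
        pos += char_len; // advance the cursor; the source is never re-copied
        count += 1;
    }
    count
}

fn main() {
    let src = "sketch(1) |> line([0, 1], %)";
    assert_eq!(lex_quadratic(src), lex_linear(src));
}
```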
0c724c4971 Start to restructure the Engine's connection to the backend (#674)
* Start to restructure the Engine's connection to the backend

1) Add in a tearDown stub for when the Engine is torn down. This is now
   distinct from a 'close', which will not stop connect from trying
   again. Running tearDown will mark the connection to not be retried
   and close active connections.

2) Move the retry logic out of connect and into the constructor. It will
   attempt to reconnect at the same rate as we had previously.

3) The timeout will now only close the connection, not restart it.

Signed-off-by: Paul Tagliamonte <paul@kittycad.io>

* Don't continue on dead conn & setTimeout on init only

* Clean up extra setTimeout

* Keep track of connection timeouts and clear on close

* Fix tsc by defining Timeout

Signed-off-by: Paul Tagliamonte <paul@kittycad.io>

* appease the format gods

---------

Signed-off-by: Paul Tagliamonte <paul@kittycad.io>
Co-authored-by: Adam Sunderland <adam@kittycad.io>
2023-09-21 12:07:47 -04:00
b54ac4a694 improve getNodePathFromSourceRange and therefore the ast explorer as well (#683)
improve getNodePathFromSourceRange and therefore the ast explorer as well
2023-09-21 05:40:41 +00:00
27227092b1 app stuck on blur when engine errors (#682)
* temp fix for when engine returns error

* don't add extruded to show function
2023-09-21 04:32:47 +00:00
04e1b92a5b Add a benchmark for parsing pipes-on-pipes (#678) 2023-09-21 03:13:07 +00:00
0553cd4621 tests for big files (#675)
* shit;

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* updates

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* cleanup

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* updates

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* u[dates;

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* updates

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* still ignore the big one

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* Add big kitt SVG to benchmarks

---------

Signed-off-by: Jess Frazelle <github@jessfraz.com>
Co-authored-by: Adam Chalmers <adam.s.chalmers@gmail.com>
2023-09-20 19:35:37 -07:00
61a0c88af4 Add IDE dirs to .gitignore (#676) 2023-09-21 02:03:09 +00:00
d5b0544437 Bump tauri-plugin-fs-extra from 5b814f5 to 76832e6 in /src-tauri (#657)
Bumps [tauri-plugin-fs-extra](https://github.com/tauri-apps/plugins-workspace) from `5b814f5` to `76832e6`.
- [Release notes](https://github.com/tauri-apps/plugins-workspace/releases)
- [Commits](5b814f56e6...76832e60bf)

---
updated-dependencies:
- dependency-name: tauri-plugin-fs-extra
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-20 18:47:38 -07:00
6cc8af5c23 make stdlib functions async (#672)
* wip

Signed-off-by: Jess Frazelle <github@jessfraz.com>

updates

Signed-off-by: Jess Frazelle <github@jessfraz.com>

updates

Signed-off-by: Jess Frazelle <github@jessfraz.com>

fixes

Signed-off-by: Jess Frazelle <github@jessfraz.com>

closer

Signed-off-by: Jess Frazelle <github@jessfraz.com>

fixes

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* fixes

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* closer

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* closer

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* compiles

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* connection

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* fixes

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* fix wasm

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* timeout

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* remove the drop

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* drop handle

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* updates

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* fixes

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* fix

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* fixes

Signed-off-by: Jess Frazelle <github@jessfraz.com>

---------

Signed-off-by: Jess Frazelle <github@jessfraz.com>
2023-09-20 18:27:08 -07:00
888104080e bump v0.9.0 (#673) 2023-09-21 10:38:40 +10:00
b6769889e3 Handle relative paths at kcl level (#506)
* handle relative paths at kcl level

* fmt

* update kittycad

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* updates

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* bump

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* fix tests

Signed-off-by: Jess Frazelle <github@jessfraz.com>

---------

Signed-off-by: Jess Frazelle <github@jessfraz.com>
Co-authored-by: Jess Frazelle <github@jessfraz.com>
Co-authored-by: Jess Frazelle <jessfraz@users.noreply.github.com>
2023-09-21 10:36:26 +10:00
a32258dac4 Engine manager can be cloned (#671) 2023-09-20 16:22:47 -07:00
18dbbad244 Use an actor to manage the Tokio engine connection (#669)
* Use an actor to manage the Tokio engine connection

This means EngineManager trait's methods take &self not &mut self, and the tokio implementation can be cloned.

* Clean up code
2023-09-20 16:59:03 -05:00
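The actor arrangement described in #669 above can be sketched as follows; the types and commands are invented for illustration and are not the real `EngineManager` API. A cloneable handle owns only a channel sender, so its methods can take `&self`, while a single spawned task owns the mutable connection state.

```rust
// Minimal actor-pattern sketch (assumes the tokio crate with rt/macros/sync features).
use tokio::sync::{mpsc, oneshot};

enum Request {
    Send { cmd: String, reply: oneshot::Sender<String> },
}

#[derive(Clone)]
struct EngineHandle {
    tx: mpsc::Sender<Request>,
}

impl EngineHandle {
    fn spawn() -> Self {
        let (tx, mut rx) = mpsc::channel(32);
        tokio::spawn(async move {
            // The actor: sole owner of the (here, simulated) connection state.
            let mut sent = 0u64;
            while let Some(Request::Send { cmd, reply }) = rx.recv().await {
                sent += 1;
                let _ = reply.send(format!("ack #{sent} for {cmd}"));
            }
        });
        EngineHandle { tx }
    }

    // &self, not &mut self: callers only push a message onto the channel.
    async fn send_command(&self, cmd: &str) -> Option<String> {
        let (reply, rx) = oneshot::channel();
        self.tx.send(Request::Send { cmd: cmd.to_string(), reply }).await.ok()?;
        rx.await.ok()
    }
}

#[tokio::main]
async fn main() {
    let engine = EngineHandle::spawn();
    let clone = engine.clone();
    println!("{:?}", engine.send_command("start_path").await);
    println!("{:?}", clone.send_command("extrude").await);
}
```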
b67c16cc9d Benchmark for KCL parser (#664)
* KCL benchmarks

* CI for benchmarks

* More specific name for benchmark

* Benchmark the right directory

* Format
2023-09-20 13:15:28 -05:00
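The benchmarks added in #664 (and run by the cargo-criterion workflow further down) follow criterion's usual bench-target shape. Here is a hypothetical skeleton with a stand-in `lex` function and sample program rather than the real kcl entry points:

```rust
// Hypothetical criterion benchmark skeleton; `lex` and SAMPLE_PROGRAM are placeholders.
use criterion::{black_box, criterion_group, criterion_main, Criterion};

const SAMPLE_PROGRAM: &str = "const part = startSketchAt([0, 0]) |> line([1, 0], %)";

fn lex(input: &str) -> usize {
    // Placeholder for the real tokenizer; the benchmark only needs something to call.
    input.split_whitespace().count()
}

fn bench_lexer(c: &mut Criterion) {
    c.bench_function("lex sample program", |b| {
        b.iter(|| lex(black_box(SAMPLE_PROGRAM)))
    });
}

criterion_group!(benches, bench_lexer);
criterion_main!(benches);
```

Such a file typically lives under `benches/` with `harness = false` set for the bench target in `Cargo.toml`.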
76 changed files with 3800 additions and 1822 deletions


@@ -54,4 +54,4 @@ jobs:
       - name: Run clippy
         run: |
           cd "${{ matrix.dir }}"
-          cargo clippy --all --tests -- -D warnings
+          cargo clippy --all --tests --benches -- -D warnings

.github/workflows/cargo-criterion.yml (new file, 37 lines)

@@ -0,0 +1,37 @@
+on:
+  push:
+    branches:
+      - main
+    paths:
+      - '**.rs'
+      - '**/Cargo.toml'
+      - '**/Cargo.lock'
+      - '**/rust-toolchain.toml'
+      - .github/workflows/cargo-criterion.yml
+  pull_request:
+    paths:
+      - '**.rs'
+      - '**/Cargo.toml'
+      - '**/Cargo.lock'
+      - '**/rust-toolchain.toml'
+      - .github/workflows/cargo-criterion.yml
+  workflow_dispatch:
+permissions: read-all
+name: cargo criterion
+jobs:
+  cargocriterion:
+    name: cargo criterion
+    runs-on: ubuntu-latest-8-cores
+    steps:
+      - uses: actions/checkout@v4
+      - uses: dtolnay/rust-toolchain@stable
+      - name: Install dependencies
+        run: |
+          cargo install cargo-criterion
+      - name: Rust Cache
+        uses: Swatinem/rust-cache@v2.6.1
+      - name: Benchmark kcl library
+        shell: bash
+        run: |-
+          cd src/wasm-lib/kcl; cargo criterion


@@ -58,4 +58,5 @@ jobs:
           cargo nextest run --workspace --no-fail-fast -P ci
         env:
           KITTYCAD_API_TOKEN: ${{secrets.KITTYCAD_API_TOKEN}}
+          RUST_MIN_STACK: 10485760000

.gitignore (5 lines changed)

@@ -22,6 +22,11 @@ npm-debug.log*
 yarn-debug.log*
 yarn-error.log*
+.idea
+.vscode
+src/wasm-lib/.idea
+src/wasm-lib/.vscode
 # rust
 src/wasm-lib/target
 src/wasm-lib/bindings


@@ -123,13 +123,24 @@ Before you submit a contribution PR to this repo, please ensure that:
 ## Release a new version
-1. Bump the versions in the .json files by creating a `Bump to v{x}.{y}.{z}` PR, committing the changes from
+1. Bump the versions in the .json files by creating a `Cut release v{x}.{y}.{z}` PR, committing the changes from
 ```bash
 VERSION=x.y.z yarn run bump-jsons
 ```
-The PR may serve as a place to discuss the human-readable changelog and extra QA.
+The PR may serve as a place to discuss the human-readable changelog and extra QA. A quick way of getting PR's merged since the last bump is to [use this PR filter](https://github.com/KittyCAD/modeling-app/pulls?q=is%3Apr+sort%3Aupdated-desc+is%3Amerged+), open up the browser console and paste in the following
+```typescript
+console.log(
+'- ' +
+Array.from(
+document.querySelectorAll('[data-hovercard-type="pull_request"]')
+).map((a) => `[${a.innerText}](${a.href})`).join(`
+- `)
+)
+```
+grab the md list and delete any that are older than the last bump
 2. Merge the PR


@@ -1,6 +1,6 @@
 {
 "name": "untitled-app",
-"version": "0.8.2",
+"version": "0.9.1",
 "private": true,
 "dependencies": {
 "@codemirror/autocomplete": "^6.9.0",
@@ -10,7 +10,7 @@
 "@fortawesome/react-fontawesome": "^0.2.0",
 "@headlessui/react": "^1.7.13",
 "@headlessui/tailwindcss": "^0.2.0",
-"@kittycad/lib": "^0.0.37",
+"@kittycad/lib": "^0.0.39",
 "@lezer/javascript": "^1.4.7",
 "@open-rpc/client-js": "^1.8.1",
 "@react-hook/resize-observer": "^1.2.6",
@@ -27,6 +27,7 @@
 "@uiw/react-codemirror": "^4.21.13",
 "@xstate/react": "^3.2.2",
 "crypto-js": "^4.1.1",
+"debounce-promise": "^3.1.2",
 "formik": "^2.4.3",
 "fuse.js": "^6.6.2",
 "http-server": "^14.1.1",
@@ -65,7 +66,7 @@
 "pretest": "yarn remove-importmeta",
 "test": "vitest --mode development",
 "test:nowatch": "vitest run --mode development",
-"test:rust": "(cd src/wasm-lib && cargo test --all && cargo clippy --all --tests)",
+"test:rust": "(cd src/wasm-lib && cargo test --all && cargo clippy --all --tests --benches)",
 "test:cov": "vitest run --coverage --mode development",
 "simpleserver:ci": "yarn pretest && http-server ./public --cors -p 3000 &",
 "simpleserver": "yarn pretest && http-server ./public --cors -p 3000",
@@ -101,7 +102,7 @@
 "@babel/preset-env": "^7.22.9",
 "@tauri-apps/cli": "^1.3.1",
 "@types/crypto-js": "^4.1.1",
-"@types/debounce": "^1.2.1",
+"@types/debounce-promise": "^3.1.6",
 "@types/isomorphic-fetch": "^0.0.36",
 "@types/react-modal": "^3.16.0",
 "@types/uuid": "^9.0.1",

src-tauri/Cargo.lock (generated, 78 lines changed)

@@ -155,6 +155,20 @@ version = "0.21.2"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "604178f6c5c21f02dc555784810edfb88d34ac2c73b2eae109655649ee73ce3d"
+[[package]]
+name = "bigdecimal"
+version = "0.4.1"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "454bca3db10617b88b566f205ed190aedb0e0e6dd4cad61d3988a72e8c5594cb"
+dependencies = [
+"autocfg",
+"libm",
+"num-bigint",
+"num-integer",
+"num-traits",
+"serde",
+]
 [[package]]
 name = "bincode"
 version = "1.3.3"
@@ -1630,13 +1644,14 @@ dependencies = [
 [[package]]
 name = "kittycad"
-version = "0.2.25"
+version = "0.2.26"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "d9cf962b1e81a0b4eb923a727e761b40672cbacc7f5f0b75e13579d346352bc7"
+checksum = "e2623ee601ce203476229df3f9d3a14664cb43e3f7455e9ac8ed91aacaa6163d"
 dependencies = [
 "anyhow",
 "async-trait",
 "base64 0.21.2",
+"bigdecimal",
 "bytes",
 "chrono",
 "data-encoding",
@@ -1688,6 +1703,12 @@ version = "0.2.148"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "9cdc71e17332e86d2e1d38c1f99edcb6288ee11b815fb1a4b049eaa2114d369b"
+[[package]]
+name = "libm"
+version = "0.2.7"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "f7012b1bbb0719e1097c47611d3898568c546d597c2e74d66f6087edd5233ff4"
 [[package]]
 name = "line-wrap"
 version = "0.1.1"
@@ -1959,6 +1980,17 @@ dependencies = [
 "winapi",
 ]
+[[package]]
+name = "num-bigint"
+version = "0.4.4"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "608e7659b5c3d7cba262d894801b9ec9d00de989e8a82bd4bef91d08da45cdc0"
+dependencies = [
+"autocfg",
+"num-integer",
+"num-traits",
+]
 [[package]]
 name = "num-integer"
 version = "0.1.45"
@@ -1982,9 +2014,9 @@ dependencies = [
 [[package]]
 name = "num-traits"
-version = "0.2.15"
+version = "0.2.16"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "578ede34cf02f8924ab9447f50c28075b4d3e5b269972345e7e0372b38c6cdcd"
+checksum = "f30b0abd723be7e2ffca1272140fac1a2f084c77ec3e123c192b66af1ee9e6c2"
 dependencies = [
 "autocfg",
 ]
@@ -2410,9 +2442,9 @@ dependencies = [
 [[package]]
 name = "phonenumber"
-version = "0.3.2+8.13.9"
+version = "0.3.3+8.13.9"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "34749f64ea9d76f10cdc8a859588b57775f59177c7dd91f744d620bd62982d6f"
+checksum = "635f3e6288e4f01c049d89332a031bd74f25d64b6fb94703ca966e819488cd06"
 dependencies = [
 "bincode",
 "either",
@@ -2425,6 +2457,7 @@ dependencies = [
 "regex-cache",
 "serde",
 "serde_derive",
+"strum",
 "thiserror",
 ]
@@ -3021,10 +3054,11 @@ dependencies = [
 [[package]]
 name = "schemars"
-version = "0.8.13"
+version = "0.8.15"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "763f8cd0d4c71ed8389c90cb8100cba87e763bd01a8e614d4f0af97bcd50a161"
+checksum = "1f7b0ce13155372a76ee2e1c5ffba1fe61ede73fbea5630d61eee6fac4929c0c"
 dependencies = [
+"bigdecimal",
 "bytes",
 "chrono",
 "dyn-clone",
@@ -3037,9 +3071,9 @@ dependencies = [
 [[package]]
 name = "schemars_derive"
-version = "0.8.13"
+version = "0.8.15"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "ec0f696e21e10fa546b7ffb1c9672c6de8fbc7a81acf59524386d8639bf12737"
+checksum = "e85e2a16b12bdb763244c69ab79363d71db2b4b918a2def53f80b02e0574b13c"
 dependencies = [
 "proc-macro2",
 "quote",
@@ -3458,6 +3492,28 @@ dependencies = [
 "syn 2.0.33",
 ]
+[[package]]
+name = "strum"
+version = "0.24.1"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "063e6045c0e62079840579a7e47a355ae92f60eb74daaf156fb1e84ba164e63f"
+dependencies = [
+"strum_macros",
+]
+[[package]]
+name = "strum_macros"
+version = "0.24.3"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "1e385be0d24f186b4ce2f9982191e7101bb737312ad61c1f2f984f34bcf85d59"
+dependencies = [
+"heck 0.4.1",
+"proc-macro2",
+"quote",
+"rustversion",
+"syn 1.0.109",
+]
 [[package]]
 name = "syn"
 version = "1.0.109"
@@ -3719,7 +3775,7 @@ dependencies = [
 [[package]]
 name = "tauri-plugin-fs-extra"
 version = "0.0.0"
-source = "git+https://github.com/tauri-apps/plugins-workspace?branch=v1#5b814f56e6368fdec46c4ddb04a07e0923ff995a"
+source = "git+https://github.com/tauri-apps/plugins-workspace?branch=v1#b04bde3461066c709d6801cf9ca305cf889a8394"
 dependencies = [
 "log",
 "serde",


@@ -16,7 +16,7 @@ tauri-build = { version = "1.4.0", features = [] }
 [dependencies]
 anyhow = "1"
-kittycad = "0.2.25"
+kittycad = "0.2.26"
 oauth2 = "4.4.2"
 serde = { version = "1.0", features = ["derive"] }
 serde_json = "1.0"


@@ -8,7 +8,7 @@
 },
 "package": {
 "productName": "kittycad-modeling",
-"version": "0.8.2"
+"version": "0.9.1"
 },
 "tauri": {
 "allowlist": {


@@ -31,6 +31,7 @@ import { TextEditor } from 'components/TextEditor'
 import { Themes, getSystemTheme } from 'lib/theme'
 import { useSetupEngineManager } from 'hooks/useSetupEngineManager'
 import { useEngineConnectionSubscriptions } from 'hooks/useEngineConnectionSubscriptions'
+import { engineCommandManager } from './lang/std/engineConnection'
 export function App() {
 const { code: loadedCode, project } = useLoaderData() as IndexLoaderData
@@ -39,7 +40,6 @@ export function App() {
 useHotKeyListener()
 const {
 setCode,
-engineCommandManager,
 buttonDownInStream,
 openPanes,
 setOpenPanes,
@@ -52,7 +52,6 @@ export function App() {
 guiMode: s.guiMode,
 setGuiMode: s.setGuiMode,
 setCode: s.setCode,
-engineCommandManager: s.engineCommandManager,
 buttonDownInStream: s.buttonDownInStream,
 openPanes: s.openPanes,
 setOpenPanes: s.setOpenPanes,
@@ -91,12 +90,12 @@ export function App() {
 if (guiMode.sketchMode === 'sketchEdit') {
 // TODO: share this with Toolbar's "Exit sketch" button
 // exiting sketch should be done consistently across all exits
-engineCommandManager?.sendSceneCommand({
+engineCommandManager.sendSceneCommand({
 type: 'modeling_cmd_req',
 cmd_id: uuidv4(),
 cmd: { type: 'edit_mode_exit' },
 })
-engineCommandManager?.sendSceneCommand({
+engineCommandManager.sendSceneCommand({
 type: 'modeling_cmd_req',
 cmd_id: uuidv4(),
 cmd: { type: 'default_camera_disable_sketch_mode' },
@@ -107,7 +106,7 @@ export function App() {
 // when exiting sketch mode in the future
 executeAst()
 } else {
-engineCommandManager?.sendSceneCommand({
+engineCommandManager.sendSceneCommand({
 type: 'modeling_cmd_req',
 cmd_id: uuidv4(),
 cmd: {
@@ -156,7 +155,7 @@ export function App() {
 useEngineConnectionSubscriptions()
 const debounceSocketSend = throttle<EngineCommand>((message) => {
-engineCommandManager?.sendSceneCommand(message)
+engineCommandManager.sendSceneCommand(message)
 }, 16)
 const handleMouseMove: MouseEventHandler<HTMLDivElement> = (e) => {
 e.nativeEvent.preventDefault()
@@ -216,7 +215,6 @@ export function App() {
 } else if (interactionGuards.zoom.dragCallback(eWithButton)) {
 interaction = 'zoom'
 } else {
-console.log('none')
 return
 }


@@ -18,6 +18,7 @@ import styles from './Toolbar.module.css'
 import { v4 as uuidv4 } from 'uuid'
 import { useAppMode } from 'hooks/useAppMode'
 import { ActionIcon } from 'components/ActionIcon'
+import { engineCommandManager } from './lang/std/engineConnection'
 export const sketchButtonClassnames = {
 background:
@@ -50,7 +51,6 @@ export const Toolbar = () => {
 ast,
 updateAst,
 programMemory,
-engineCommandManager,
 executeAst,
 } = useStore((s) => ({
 guiMode: s.guiMode,
@@ -59,15 +59,10 @@ export const Toolbar = () => {
 ast: s.ast,
 updateAst: s.updateAst,
 programMemory: s.programMemory,
-engineCommandManager: s.engineCommandManager,
 executeAst: s.executeAst,
 }))
 useAppMode()
-useEffect(() => {
-console.log('guiMode', guiMode)
-}, [guiMode])
 function ToolbarButtons({ className }: React.HTMLAttributes<HTMLElement>) {
 return (
 <span className={styles.toolbarButtons + ' ' + className}>
@@ -173,12 +168,12 @@ export const Toolbar = () => {
 {guiMode.mode === 'sketch' && (
 <button
 onClick={() => {
-engineCommandManager?.sendSceneCommand({
+engineCommandManager.sendSceneCommand({
 type: 'modeling_cmd_req',
 cmd_id: uuidv4(),
 cmd: { type: 'edit_mode_exit' },
 })
-engineCommandManager?.sendSceneCommand({
+engineCommandManager.sendSceneCommand({
 type: 'modeling_cmd_req',
 cmd_id: uuidv4(),
 cmd: { type: 'default_camera_disable_sketch_mode' },
@@ -214,7 +209,7 @@ export const Toolbar = () => {
 <button
 key={sketchFnName}
 onClick={() => {
-engineCommandManager?.sendSceneCommand({
+engineCommandManager.sendSceneCommand({
 type: 'modeling_cmd_req',
 cmd_id: uuidv4(),
 cmd: {


@@ -10,6 +10,7 @@ import {
 } from '../lang/modifyAst'
 import { findAllPreviousVariables, PrevVariable } from '../lang/queryAst'
 import { useStore } from '../useStore'
+import { engineCommandManager } from '../lang/std/engineConnection'
 export const AvailableVars = ({
 onVarClick,
@@ -92,14 +93,11 @@ export function useCalc({
 newVariableInsertIndex: number
 setNewVariableName: (a: string) => void
 } {
-const { ast, programMemory, selectionRange, engineCommandManager } = useStore(
-(s) => ({
-ast: s.ast,
-programMemory: s.programMemory,
-selectionRange: s.selectionRanges.codeBasedSelections[0].range,
-engineCommandManager: s.engineCommandManager,
-})
-)
+const { ast, programMemory, selectionRange } = useStore((s) => ({
+ast: s.ast,
+programMemory: s.programMemory,
+selectionRange: s.selectionRanges.codeBasedSelections[0].range,
+}))
 const inputRef = useRef<HTMLInputElement>(null)
 const [availableVarInfo, setAvailableVarInfo] = useState<
 ReturnType<typeof findAllPreviousVariables>
@@ -140,7 +138,6 @@ export function useCalc({
 }, [ast, programMemory, selectionRange])
 useEffect(() => {
-if (!engineCommandManager) return
 try {
 const code = `const __result__ = ${value}\nshow(__result__)`
 const ast = parser_wasm(code)


@@ -1,5 +1,4 @@
 import { CollapsiblePanel, CollapsiblePanelProps } from './CollapsiblePanel'
-import { useStore } from '../useStore'
 import { v4 as uuidv4 } from 'uuid'
 import { EngineCommand } from '../lang/std/engineConnection'
 import { useState } from 'react'
@@ -7,6 +6,7 @@ import { ActionButton } from '../components/ActionButton'
 import { faCheck } from '@fortawesome/free-solid-svg-icons'
 import { isReducedMotion } from 'lang/util'
 import { AstExplorer } from './AstExplorer'
+import { engineCommandManager } from '../lang/std/engineConnection'
 type SketchModeCmd = Extract<
 Extract<EngineCommand, { type: 'modeling_cmd_req' }>['cmd'],
@@ -14,9 +14,6 @@ type SketchModeCmd = Extract<
 >
 export const DebugPanel = ({ className, ...props }: CollapsiblePanelProps) => {
-const { engineCommandManager } = useStore((s) => ({
-engineCommandManager: s.engineCommandManager,
-}))
 const [sketchModeCmd, setSketchModeCmd] = useState<SketchModeCmd>({
 type: 'default_camera_enable_sketch_mode',
 origin: { x: 0, y: 0, z: 0 },
@@ -70,19 +67,18 @@ export const DebugPanel = ({ className, ...props }: CollapsiblePanelProps) => {
 className="w-16"
 type="checkbox"
 checked={sketchModeCmd.ortho}
-onChange={(a) => {
-console.log(a, (a as any).checked)
+onChange={(a) =>
 setSketchModeCmd({
 ...sketchModeCmd,
 ortho: a.target.checked,
 })
-}}
+}
 />
 </div>
 <ActionButton
 Element="button"
 onClick={() => {
-engineCommandManager?.sendSceneCommand({
+engineCommandManager.sendSceneCommand({
 type: 'modeling_cmd_req',
 cmd: sketchModeCmd,
 cmd_id: uuidv4(),


@@ -1,11 +1,11 @@
 import { v4 as uuidv4 } from 'uuid'
-import { useStore } from '../useStore'
 import { faFileExport, faXmark } from '@fortawesome/free-solid-svg-icons'
 import { ActionButton } from './ActionButton'
 import Modal from 'react-modal'
 import React from 'react'
 import { useFormik } from 'formik'
 import { Models } from '@kittycad/lib'
+import { engineCommandManager } from '../lang/std/engineConnection'
 type OutputFormat = Models['OutputFormat_type']
@@ -18,10 +18,6 @@ interface ExportButtonProps extends React.PropsWithChildren {
 }
 export const ExportButton = ({ children, className }: ExportButtonProps) => {
-const { engineCommandManager } = useStore((s) => ({
-engineCommandManager: s.engineCommandManager,
-}))
 const [modalIsOpen, setIsOpen] = React.useState(false)
 const defaultType = 'gltf'
@@ -66,7 +62,7 @@ export const ExportButton = ({ children, className }: ExportButtonProps) => {
 },
 }
 }
-engineCommandManager?.sendSceneCommand({
+engineCommandManager.sendSceneCommand({
 type: 'modeling_cmd_req',
 cmd: {
 type: 'export',


@@ -50,7 +50,7 @@ export const MemoryPanel = ({
 export const processMemory = (programMemory: ProgramMemory) => {
 const processedMemory: any = {}
-Object.keys(programMemory.root).forEach((key) => {
+Object.keys(programMemory?.root || {}).forEach((key) => {
 const val = programMemory.root[key]
 if (typeof val.value !== 'function') {
 if (val.type === 'SketchGroup') {


@@ -25,6 +25,7 @@ import { modify_ast_for_sketch } from '../wasm-lib/pkg/wasm_lib'
 import { KCLError } from 'lang/errors'
 import { KclError as RustKclError } from '../wasm-lib/kcl/bindings/KclError'
 import { rangeTypeFix } from 'lang/abstractSyntaxTree'
+import { engineCommandManager } from '../lang/std/engineConnection'
 export const Stream = ({ className = '' }) => {
 const [isLoading, setIsLoading] = useState(true)
@@ -32,7 +33,6 @@ export const Stream = ({ className = '' }) => {
 const videoRef = useRef<HTMLVideoElement>(null)
 const {
 mediaStream,
-engineCommandManager,
 setButtonDownInStream,
 didDragInStream,
 setDidDragInStream,
@@ -45,7 +45,6 @@
 programMemory,
 } = useStore((s) => ({
 mediaStream: s.mediaStream,
-engineCommandManager: s.engineCommandManager,
 setButtonDownInStream: s.setButtonDownInStream,
 fileId: s.fileId,
 didDragInStream: s.didDragInStream,
@@ -73,7 +72,7 @@
 if (!videoRef.current) return
 if (!mediaStream) return
 videoRef.current.srcObject = mediaStream
-}, [mediaStream, engineCommandManager])
+}, [mediaStream])
 const handleMouseDown: MouseEventHandler<HTMLVideoElement> = (e) => {
 if (!videoRef.current) return
@@ -107,7 +106,7 @@
 }
 if (guiMode.mode === 'sketch' && guiMode.sketchMode === ('move' as any)) {
-engineCommandManager?.sendSceneCommand({
+engineCommandManager.sendSceneCommand({
 type: 'modeling_cmd_req',
 cmd: {
 type: 'handle_mouse_drag_start',
@@ -121,7 +120,7 @@
 guiMode.sketchMode === ('sketch_line' as any)
 )
 ) {
-engineCommandManager?.sendSceneCommand({
+engineCommandManager.sendSceneCommand({
 type: 'modeling_cmd_req',
 cmd: {
 type: 'camera_drag_start',
@@ -139,7 +138,7 @@
 const handleScroll: WheelEventHandler<HTMLVideoElement> = (e) => {
 if (!cameraMouseDragGuards[cameraControls].zoom.scrollCallback(e)) return
-engineCommandManager?.sendSceneCommand({
+engineCommandManager.sendSceneCommand({
 type: 'modeling_cmd_req',
 cmd: {
 type: 'default_camera_zoom',
@@ -177,7 +176,7 @@
 }
 if (!didDragInStream) {
-engineCommandManager?.sendSceneCommand({
+engineCommandManager.sendSceneCommand({
 type: 'modeling_cmd_req',
 cmd: {
 type: 'select_with_point',
@@ -214,7 +213,7 @@
 window: { x, y },
 }
 }
-engineCommandManager?.sendSceneCommand(command).then(async (resp) => {
+engineCommandManager.sendSceneCommand(command).then(async (resp) => {
 if (!(guiMode.mode === 'sketch')) return
 if (guiMode.sketchMode === 'selectFace') return
@@ -240,9 +239,6 @@
 ) {
 // Let's get the updated ast.
 if (sketchGroupId === '') return
-console.log('guiMode.pathId', guiMode.pathId)
 // We have a problem if we do not have an id for the sketch group.
 if (
 guiMode.pathId === undefined ||
@@ -285,7 +281,7 @@
 guiMode.waitingFirstClick &&
 !isEditingExistingSketch
 ) {
-const curve = await engineCommandManager?.sendSceneCommand({
+const curve = await engineCommandManager.sendSceneCommand({
 type: 'modeling_cmd_req',
 cmd_id: uuidv4(),
 cmd: {
@@ -326,7 +322,7 @@
 resp?.data?.data?.entities_modified?.length &&
 (!guiMode.waitingFirstClick || isEditingExistingSketch)
 ) {
-const curve = await engineCommandManager?.sendSceneCommand({
+const curve = await engineCommandManager.sendSceneCommand({
 type: 'modeling_cmd_req',
 cmd_id: uuidv4(),
 cmd: {
@@ -371,12 +367,12 @@
 setGuiMode({
 mode: 'default',
 })
-engineCommandManager?.sendSceneCommand({
+engineCommandManager.sendSceneCommand({
 type: 'modeling_cmd_req',
 cmd_id: uuidv4(),
 cmd: { type: 'edit_mode_exit' },
 })
-engineCommandManager?.sendSceneCommand({
+engineCommandManager.sendSceneCommand({
 type: 'modeling_cmd_req',
 cmd_id: uuidv4(),
 cmd: { type: 'default_camera_disable_sketch_mode' },


@@ -30,6 +30,7 @@ import { isOverlap, roundOff } from 'lib/utils'
 import { kclErrToDiagnostic } from 'lang/errors'
 import { CSSRuleObject } from 'tailwindcss/types/config'
 import interact from '@replit/codemirror-interact'
+import { engineCommandManager } from '../lang/std/engineConnection'
 export const editorShortcutMeta = {
 formatCode: {
@@ -52,7 +53,6 @@ export const TextEditor = ({
 code,
 deferredSetCode,
 editorView,
-engineCommandManager,
 formatCode,
 isLSPServerReady,
 selectionRanges,
@@ -64,7 +64,6 @@
 code: s.code,
 deferredSetCode: s.deferredSetCode,
 editorView: s.editorView,
-engineCommandManager: s.engineCommandManager,
 formatCode: s.formatCode,
 isLSPServerReady: s.isLSPServerReady,
 selectionRanges: s.selectionRanges,
@@ -173,7 +172,7 @@
 const idBasedSelections = codeBasedSelections
 .map(({ type, range }) => {
 const hasOverlap = Object.entries(
-engineCommandManager?.sourceRangeMap || {}
+engineCommandManager.sourceRangeMap || {}
 ).filter(([_, sourceRange]) => {
 return isOverlap(sourceRange, range)
 })
@@ -186,7 +185,7 @@
 })
 .filter(Boolean) as any
-engineCommandManager?.cusorsSelected({
+engineCommandManager.cusorsSelected({
 otherSelections: [],
 idBasedSelections,
 })


@@ -133,7 +133,7 @@ export const SetAbsDistance = ({ buttonType }: { buttonType: ButtonType }) => {
 callBack: updateCursors(setCursor, selectionRanges, pathToNodeMap),
 })
 } catch (e) {
-console.log('e', e)
+console.log('error', e)
 }
 }}
 disabled={!enableAngLen}


@@ -147,7 +147,7 @@ export const SetAngleLength = ({
 callBack: updateCursors(setCursor, selectionRanges, pathToNodeMap),
 })
 } catch (e) {
-console.log('e', e)
+console.log('erorr', e)
 }
 }}
 disabled={!enableAngLen}


@@ -109,7 +109,6 @@ export default class Client extends jsrpc.JSONRPCServerAndClient {
 }
 }
 messageString += message
-// console.log(messageString)
 return
 })


@@ -13,6 +13,7 @@ import {
 CompletionItemKind,
 CompletionTriggerKind,
 } from 'vscode-languageserver-protocol'
+import debounce from 'debounce-promise'
 import type {
 Completion,
@@ -53,14 +54,11 @@ export class LanguageServerPlugin implements PluginValue {
 private languageId: string
 private documentVersion: number
-private changesTimeout: number
 constructor(private view: EditorView, private allowHTMLContent: boolean) {
 this.client = this.view.state.facet(client)
 this.documentUri = this.view.state.facet(documentUri)
 this.languageId = this.view.state.facet(languageId)
 this.documentVersion = 0
-this.changesTimeout = 0
 this.client.attachPlugin(this)
@@ -71,12 +69,10 @@ export class LanguageServerPlugin implements PluginValue {
 update({ docChanged }: ViewUpdate) {
 if (!docChanged) return
-if (this.changesTimeout) clearTimeout(this.changesTimeout)
-this.changesTimeout = window.setTimeout(() => {
-this.sendChange({
-documentText: this.view.state.doc.toString(),
-})
-}, changesDelay)
+this.sendChange({
+documentText: this.view.state.doc.toString(),
+})
 }
 destroy() {
@@ -99,14 +95,32 @@ export class LanguageServerPlugin implements PluginValue {
 async sendChange({ documentText }: { documentText: string }) {
 if (!this.client.ready) return
+if (documentText.length > 5000) {
+// Clear out the text it thinks we have, large documents will throw a stack error.
+// This is obviously not a good fix but it works for now til we figure
+// out the stack limits in wasm and also rewrite the parser.
+// Since this is only for hover and completions it will be fine,
+// completions will still work for stdlib but hover will not.
+// That seems like a fine trade-off for a working editor for the time
+// being.
+documentText = ''
+}
 try {
-await this.client.textDocumentDidChange({
-textDocument: {
-uri: this.documentUri,
-version: this.documentVersion++,
-},
-contentChanges: [{ text: documentText }],
-})
+debounce(
+() => {
+return this.client.textDocumentDidChange({
+textDocument: {
+uri: this.documentUri,
+version: this.documentVersion++,
+},
+contentChanges: [{ text: documentText }],
+})
+},
+changesDelay,
+{ leading: true }
+)
 } catch (e) {
 console.error(e)
 }


@@ -8,6 +8,7 @@ import { ArtifactMap, EngineCommandManager } from 'lang/std/engineConnection'
 import { Models } from '@kittycad/lib/dist/types/src'
 import { isReducedMotion } from 'lang/util'
 import { isOverlap } from 'lib/utils'
+import { engineCommandManager } from '../lang/std/engineConnection'
 interface DefaultPlanes {
 xy: string
@@ -17,19 +18,13 @@ interface DefaultPlanes {
 }
 export function useAppMode() {
-const {
-guiMode,
-setGuiMode,
-selectionRanges,
-engineCommandManager,
-selectionRangeTypeMap,
-} = useStore((s) => ({
-guiMode: s.guiMode,
-setGuiMode: s.setGuiMode,
-selectionRanges: s.selectionRanges,
-engineCommandManager: s.engineCommandManager,
-selectionRangeTypeMap: s.selectionRangeTypeMap,
-}))
+const { guiMode, setGuiMode, selectionRanges, selectionRangeTypeMap } =
+useStore((s) => ({
+guiMode: s.guiMode,
+setGuiMode: s.setGuiMode,
+selectionRanges: s.selectionRanges,
+selectionRangeTypeMap: s.selectionRangeTypeMap,
+}))
 const [defaultPlanes, setDefaultPlanes] = useState<DefaultPlanes | null>(null)
 useEffect(() => {
 if (
@@ -65,7 +60,7 @@ export function useAppMode() {
 setDefaultPlanesHidden(engineCommandManager, localDefaultPlanes, true)
 // TODO figure out the plane to use based on the sketch
 // maybe it's easier to make a new plane than rely on the defaults
-await engineCommandManager?.sendSceneCommand({
+await engineCommandManager.sendSceneCommand({
 type: 'modeling_cmd_req',
 cmd_id: uuidv4(),
 cmd: {
@@ -135,7 +130,7 @@ export function useAppMode() {
 ])
 useEffect(() => {
-const unSub = engineCommandManager?.subscribeTo({
+const unSub = engineCommandManager.subscribeTo({
 event: 'select_with_point',
 callback: async ({ data }) => {
 if (!data.entity_id) return
@@ -144,18 +139,16 @@ export function useAppMode() {
 // user clicked something else in the scene
 return
 }
-const sketchModeResponse = await engineCommandManager?.sendSceneCommand(
-{
-type: 'modeling_cmd_req',
-cmd_id: uuidv4(),
-cmd: {
-type: 'sketch_mode_enable',
-plane_id: data.entity_id,
-ortho: true,
-animated: !isReducedMotion(),
-},
-}
-)
+const sketchModeResponse = await engineCommandManager.sendSceneCommand({
+type: 'modeling_cmd_req',
+cmd_id: uuidv4(),
+cmd: {
+type: 'sketch_mode_enable',
+plane_id: data.entity_id,
+ortho: true,
+animated: !isReducedMotion(),
+},
+})
 setDefaultPlanesHidden(engineCommandManager, defaultPlanes, true)
 const sketchUuid = uuidv4()
 const proms: any[] = []
@@ -178,8 +171,7 @@ export function useAppMode() {
 },
 })
 )
-const res = await Promise.all(proms)
-console.log('res', res)
+await Promise.all(proms)
 setGuiMode({
 mode: 'sketch',
 sketchMode: 'sketchEdit',
@@ -209,7 +201,7 @@ async function createPlane(
 }
 ) {
 const planeId = uuidv4()
-await engineCommandManager?.sendSceneCommand({
+await engineCommandManager.sendSceneCommand({
 type: 'modeling_cmd_req',
 cmd: {
 type: 'make_plane',
@@ -221,7 +213,7 @@ async function createPlane(
 },
 cmd_id: planeId,
 })
-await engineCommandManager?.sendSceneCommand({
+await engineCommandManager.sendSceneCommand({
 type: 'modeling_cmd_req',
 cmd: {
 type: 'plane_set_color',
@@ -234,12 +226,12 @@
 }
 function setDefaultPlanesHidden(
-engineCommandManager: EngineCommandManager | undefined,
+engineCommandManager: EngineCommandManager,
 defaultPlanes: DefaultPlanes,
 hidden: boolean
 ) {
 Object.values(defaultPlanes).forEach((planeId) => {
-engineCommandManager?.sendSceneCommand({
+engineCommandManager.sendSceneCommand({
 type: 'modeling_cmd_req',
 cmd_id: uuidv4(),
 cmd: {


@@ -1,14 +1,9 @@
 import { useEffect } from 'react'
 import { useStore } from 'useStore'
+import { engineCommandManager } from '../lang/std/engineConnection'
 export function useEngineConnectionSubscriptions() {
-const {
-engineCommandManager,
-setCursor2,
-setHighlightRange,
-highlightRange,
-} = useStore((s) => ({
-engineCommandManager: s.engineCommandManager,
+const { setCursor2, setHighlightRange, highlightRange } = useStore((s) => ({
 setCursor2: s.setCursor2,
 setHighlightRange: s.setHighlightRange,
 highlightRange: s.highlightRange,


@ -1,53 +1,90 @@
import { useLayoutEffect } from 'react' import { useLayoutEffect, useEffect, useRef } from 'react'
import { _executor } from '../lang/executor' import { _executor } from '../lang/executor'
import { useStore } from '../useStore' import { useStore } from '../useStore'
import { EngineCommandManager } from '../lang/std/engineConnection' import { engineCommandManager } from '../lang/std/engineConnection'
import { deferExecution } from 'lib/utils'
export function useSetupEngineManager( export function useSetupEngineManager(
streamRef: React.RefObject<HTMLDivElement>, streamRef: React.RefObject<HTMLDivElement>,
token?: string token?: string
) { ) {
const { const {
setEngineCommandManager,
setMediaStream, setMediaStream,
setIsStreamReady, setIsStreamReady,
setStreamDimensions, setStreamDimensions,
executeCode, executeCode,
streamDimensions,
} = useStore((s) => ({ } = useStore((s) => ({
setEngineCommandManager: s.setEngineCommandManager,
setMediaStream: s.setMediaStream, setMediaStream: s.setMediaStream,
setIsStreamReady: s.setIsStreamReady, setIsStreamReady: s.setIsStreamReady,
setStreamDimensions: s.setStreamDimensions, setStreamDimensions: s.setStreamDimensions,
executeCode: s.executeCode, executeCode: s.executeCode,
streamDimensions: s.streamDimensions,
})) }))
const streamWidth = streamRef?.current?.offsetWidth const streamWidth = streamRef?.current?.offsetWidth
const streamHeight = streamRef?.current?.offsetHeight const streamHeight = streamRef?.current?.offsetHeight
const hasSetNonZeroDimensions = useRef<boolean>(false)
useLayoutEffect(() => {
// Load the engine command manager once with the initial width and height,
// then we do not want to reload it.
const { width: quadWidth, height: quadHeight } = getDimensions(
streamWidth,
streamHeight
)
if (!hasSetNonZeroDimensions.current && quadHeight && quadWidth) {
engineCommandManager.start({
setMediaStream,
setIsStreamReady,
width: quadWidth,
height: quadHeight,
token,
})
engineCommandManager.waitForReady.then(() => {
executeCode()
})
setStreamDimensions({
streamWidth: quadWidth,
streamHeight: quadHeight,
})
hasSetNonZeroDimensions.current = true
}
}, [streamRef?.current?.offsetWidth, streamRef?.current?.offsetHeight])
useEffect(() => {
const handleResize = deferExecution(() => {
const { width, height } = getDimensions(
streamRef?.current?.offsetWidth,
streamRef?.current?.offsetHeight
)
if (
streamDimensions.streamWidth !== width ||
streamDimensions.streamHeight !== height
) {
engineCommandManager.handleResize({
streamWidth: width,
streamHeight: height,
})
setStreamDimensions({
streamWidth: width,
streamHeight: height,
})
}
}, 500)
window.addEventListener('resize', handleResize)
return () => {
window.removeEventListener('resize', handleResize)
}
}, [])
}
function getDimensions(streamWidth?: number, streamHeight?: number) {
const width = streamWidth ? streamWidth : 0 const width = streamWidth ? streamWidth : 0
const quadWidth = Math.round(width / 4) * 4 const quadWidth = Math.round(width / 4) * 4
const height = streamHeight ? streamHeight : 0 const height = streamHeight ? streamHeight : 0
const quadHeight = Math.round(height / 4) * 4 const quadHeight = Math.round(height / 4) * 4
return { width: quadWidth, height: quadHeight }
useLayoutEffect(() => {
setStreamDimensions({
streamWidth: quadWidth,
streamHeight: quadHeight,
})
if (!width || !height) return
const eng = new EngineCommandManager({
setMediaStream,
setIsStreamReady,
width: quadWidth,
height: quadHeight,
token,
})
setEngineCommandManager(eng)
eng.waitForReady.then(() => {
executeCode()
})
return () => {
eng?.tearDown()
}
}, [quadWidth, quadHeight])
} }
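The hook above starts the engine connection once with the initial quad-rounded dimensions, then listens for window resizes, defers them, and only reconfigures the stream when the rounded size actually changes. A rough standalone sketch of that pattern, assuming a generic debounce helper in place of deferExecution and a reconfigure callback in place of engineCommandManager.handleResize / setStreamDimensions:

// Sketch only: round dimensions to multiples of 4 and debounce resize events
// before asking for a stream reconfigure. `debounce` and `reconfigure` are
// stand-ins, not the app's real helpers.
type Dims = { width: number; height: number }

function quantize(width?: number, height?: number): Dims {
  // Treat missing values as 0 and round to the nearest multiple of 4.
  return {
    width: Math.round((width ?? 0) / 4) * 4,
    height: Math.round((height ?? 0) / 4) * 4,
  }
}

function debounce(fn: () => void, ms: number): () => void {
  let timer: ReturnType<typeof setTimeout> | undefined
  return () => {
    if (timer) clearTimeout(timer)
    timer = setTimeout(fn, ms)
  }
}

function watchResize(el: HTMLElement, reconfigure: (d: Dims) => void) {
  let last = quantize(el.offsetWidth, el.offsetHeight)
  const onResize = debounce(() => {
    const next = quantize(el.offsetWidth, el.offsetHeight)
    // Only reconfigure when the rounded size actually changed.
    if (next.width !== last.width || next.height !== last.height) {
      last = next
      reconfigure(next)
    }
  }, 500)
  window.addEventListener('resize', onResize)
  return () => window.removeEventListener('resize', onResize)
}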

View File

@ -48,7 +48,7 @@ export function useConvertToVariable() {
updateAst(_modifiedAst, true) updateAst(_modifiedAst, true)
} catch (e) { } catch (e) {
console.log('e', e) console.log('error', e)
} }
} }

View File

@ -1691,7 +1691,6 @@ describe('parsing errors', () => {
let _theError let _theError
try { try {
const result = expect(parser_wasm(code)) const result = expect(parser_wasm(code))
console.log('result', result)
} catch (e) { } catch (e) {
_theError = e _theError = e
} }

View File

@ -1,6 +1,7 @@
import { getNodePathFromSourceRange, getNodeFromPath } from './queryAst' import { getNodePathFromSourceRange, getNodeFromPath } from './queryAst'
import { parser_wasm } from './abstractSyntaxTree' import { parser_wasm } from './abstractSyntaxTree'
import { initPromise } from './rust' import { initPromise } from './rust'
import { Identifier } from './abstractSyntaxTreeTypes'
beforeAll(() => initPromise) beforeAll(() => initPromise)
@ -27,4 +28,81 @@ const sk3 = startSketchAt([0, 0])
expect([node.start, node.end]).toEqual(sourceRange) expect([node.start, node.end]).toEqual(sourceRange)
expect(node.type).toBe('CallExpression') expect(node.type).toBe('CallExpression')
}) })
it('gets path right for function definition params', () => {
const code = `fn cube = (pos, scale) => {
const sg = startSketchAt(pos)
|> line([0, scale], %)
|> line([scale, 0], %)
|> line([0, -scale], %)
return sg
}
const b1 = cube([0,0], 10)`
const subStr = 'pos, scale'
const subStrIndex = code.indexOf(subStr)
const sourceRange: [number, number] = [
subStrIndex,
subStrIndex + 'pos'.length,
]
const ast = parser_wasm(code)
const nodePath = getNodePathFromSourceRange(ast, sourceRange)
const node = getNodeFromPath<Identifier>(ast, nodePath).node
expect(nodePath).toEqual([
['body', ''],
[0, 'index'],
['declarations', 'VariableDeclaration'],
[0, 'index'],
['init', ''],
['params', 'FunctionExpression'],
[0, 'index'],
])
expect(node.type).toBe('Identifier')
expect(node.name).toBe('pos')
})
it('gets path right for deep within function definition body', () => {
const code = `fn cube = (pos, scale) => {
const sg = startSketchAt(pos)
|> line([0, scale], %)
|> line([scale, 0], %)
|> line([0, -scale], %)
return sg
}
const b1 = cube([0,0], 10)`
const subStr = 'scale, 0'
const subStrIndex = code.indexOf(subStr)
const sourceRange: [number, number] = [
subStrIndex,
subStrIndex + 'scale'.length,
]
const ast = parser_wasm(code)
const nodePath = getNodePathFromSourceRange(ast, sourceRange)
const node = getNodeFromPath<Identifier>(ast, nodePath).node
expect(nodePath).toEqual([
['body', ''],
[0, 'index'],
['declarations', 'VariableDeclaration'],
[0, 'index'],
['init', ''],
['body', 'FunctionExpression'],
['body', 'FunctionExpression'],
[0, 'index'],
['declarations', 'VariableDeclaration'],
[0, 'index'],
['init', ''],
['body', 'PipeExpression'],
[2, 'index'],
['arguments', 'CallExpression'],
[0, 'index'],
['elements', 'ArrayExpression'],
[0, 'index'],
])
expect(node.type).toBe('Identifier')
expect(node.name).toBe('scale')
})
}) })

View File

@ -321,7 +321,7 @@ export function extrudeSketch(
[0, 'index'], [0, 'index'],
] ]
return { return {
modifiedAst: addToShow(_node, name), modifiedAst: node,
pathToNode: [...pathToNode.slice(0, -1), [showCallIndex, 'index']], pathToNode: [...pathToNode.slice(0, -1), [showCallIndex, 'index']],
pathToExtrudeArg, pathToExtrudeArg,
} }

View File

@ -239,7 +239,29 @@ function moreNodePathFromSourceRange(
} }
return path return path
} }
console.error('not implemented') if (_node.type === 'FunctionExpression' && isInRange) {
for (let i = 0; i < _node.params.length; i++) {
const param = _node.params[i]
if (param.start <= start && param.end >= end) {
path.push(['params', 'FunctionExpression'])
path.push([i, 'index'])
return moreNodePathFromSourceRange(param, sourceRange, path)
}
}
if (_node.body.start <= start && _node.body.end >= end) {
path.push(['body', 'FunctionExpression'])
const fnBody = _node.body.body
for (let i = 0; i < fnBody.length; i++) {
const statement = fnBody[i]
if (statement.start <= start && statement.end >= end) {
path.push(['body', 'FunctionExpression'])
path.push([i, 'index'])
return moreNodePathFromSourceRange(statement, sourceRange, path)
}
}
}
}
console.error('not implemented: ' + node.type)
return path return path
} }
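The new FunctionExpression branch above descends into whichever child (a parameter, or a statement in the body) whose start/end fully contains the requested source range, pushing [key, hint] steps onto the path as it goes. A simplified sketch of that containment walk, using made-up SimpleNode and PathStep shapes rather than the real AST types:

// Illustrative only: descend into the child node that fully contains the
// [start, end] range, recording path steps like the entries shown above.
interface SimpleNode {
  type: string
  start: number
  end: number
  [key: string]: any
}

type PathStep = [string | number, string]

function contains(node: { start: number; end: number }, start: number, end: number) {
  return node.start <= start && node.end >= end
}

function pathForFunction(fn: SimpleNode, start: number, end: number, path: PathStep[] = []): PathStep[] {
  // Check the parameters first, then fall back to the body, mirroring the
  // FunctionExpression handling in moreNodePathFromSourceRange.
  for (let i = 0; i < fn.params.length; i++) {
    if (contains(fn.params[i], start, end)) {
      path.push(['params', 'FunctionExpression'], [i, 'index'])
      return path
    }
  }
  if (contains(fn.body, start, end)) {
    path.push(['body', 'FunctionExpression'])
  }
  return path
}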

View File

@ -7,7 +7,7 @@ export const recast = (ast: Program): string => {
return s return s
} catch (e) { } catch (e) {
// TODO: do something real with the error. // TODO: do something real with the error.
console.log('recast', e) console.log('recast error', e)
throw e throw e
} }
} }

View File

@ -23,6 +23,10 @@ interface ResultCommand extends CommandInfo {
data: any data: any
raw: WebSocketResponse raw: WebSocketResponse
} }
interface FailedCommand extends CommandInfo {
type: 'failed'
errors: Models['FailureWebSocketResponse_type']['errors']
}
interface PendingCommand extends CommandInfo { interface PendingCommand extends CommandInfo {
type: 'pending' type: 'pending'
promise: Promise<any> promise: Promise<any>
@ -30,7 +34,7 @@ interface PendingCommand extends CommandInfo {
} }
export interface ArtifactMap { export interface ArtifactMap {
[key: string]: ResultCommand | PendingCommand [key: string]: ResultCommand | PendingCommand | FailedCommand
} }
export interface SourceRangeMap { export interface SourceRangeMap {
[key: string]: SourceRange [key: string]: SourceRange
@ -41,6 +45,11 @@ interface NewTrackArgs {
mediaStream: MediaStream mediaStream: MediaStream
} }
// This looks funny, I know. This is needed because node and the browser
// disagree as to the type. In a browser it's a number, but in node it's a
// "Timeout".
type Timeout = ReturnType<typeof setTimeout>
type ClientMetrics = Models['ClientMetrics_type'] type ClientMetrics = Models['ClientMetrics_type']
// EngineConnection encapsulates the connection(s) to the Engine // EngineConnection encapsulates the connection(s) to the Engine
@ -52,6 +61,9 @@ export class EngineConnection {
unreliableDataChannel?: RTCDataChannel unreliableDataChannel?: RTCDataChannel
private ready: boolean private ready: boolean
private connecting: boolean
private dead: boolean
private failedConnTimeout: Timeout | null
readonly url: string readonly url: string
private readonly token?: string private readonly token?: string
@ -87,6 +99,9 @@ export class EngineConnection {
this.url = url this.url = url
this.token = token this.token = token
this.ready = false this.ready = false
this.connecting = false
this.dead = false
this.failedConnTimeout = null
this.onWebsocketOpen = onWebsocketOpen this.onWebsocketOpen = onWebsocketOpen
this.onDataChannelOpen = onDataChannelOpen this.onDataChannelOpen = onDataChannelOpen
this.onEngineConnectionOpen = onEngineConnectionOpen this.onEngineConnectionOpen = onEngineConnectionOpen
@ -97,7 +112,10 @@ export class EngineConnection {
// TODO(paultag): This ought to be tweakable. // TODO(paultag): This ought to be tweakable.
const pingIntervalMs = 10000 const pingIntervalMs = 10000
setInterval(() => { let pingInterval = setInterval(() => {
if (this.dead) {
clearInterval(pingInterval)
}
if (this.isReady()) { if (this.isReady()) {
// When we're online, every 10 seconds, we'll attempt to put a 'ping' // When we're online, every 10 seconds, we'll attempt to put a 'ping'
// command through the WebSocket connection. This will help both ends // command through the WebSocket connection. This will help both ends
@ -106,6 +124,24 @@ export class EngineConnection {
this.send({ type: 'ping' }) this.send({ type: 'ping' })
} }
}, pingIntervalMs) }, pingIntervalMs)
const connectionTimeoutMs = VITE_KC_CONNECTION_TIMEOUT_MS
let connectInterval = setInterval(() => {
if (this.dead) {
clearInterval(connectInterval)
return
}
if (this.isReady()) {
return
}
console.log('connecting via retry')
this.connect()
}, connectionTimeoutMs)
}
// isConnecting will return true when connect has been called, but the full
// WebRTC is not online.
isConnecting() {
return this.connecting
} }
// isReady will return true only when the WebRTC *and* WebSocket connection // isReady will return true only when the WebRTC *and* WebSocket connection
// are connected. During setup, the WebSocket connection comes online first, // are connected. During setup, the WebSocket connection comes online first,
@ -114,6 +150,10 @@ export class EngineConnection {
isReady() { isReady() {
return this.ready return this.ready
} }
tearDown() {
this.dead = true
this.close()
}
// shouldTrace will return true when Sentry should be used to instrument // shouldTrace will return true when Sentry should be used to instrument
// the Engine. // the Engine.
shouldTrace() { shouldTrace() {
@ -125,8 +165,10 @@ export class EngineConnection {
// This will attempt the full handshake, and retry if the connection // This will attempt the full handshake, and retry if the connection
// did not establish. // did not establish.
connect() { connect() {
// TODO(paultag): make this safe to call multiple times, and figure out console.log('connect was called')
// when a connection is in progress (state: connecting or something). if (this.isConnecting() || this.isReady()) {
return
}
// Information on the connect transaction // Information on the connect transaction
@ -358,20 +400,6 @@ export class EngineConnection {
}) })
}) })
} }
// TODO(paultag): This ought to be both controllable, as well as something
// like exponential backoff to have some grace on the backend, as well as
// fix responsiveness for clients that had a weird network hiccup.
const connectionTimeoutMs = VITE_KC_CONNECTION_TIMEOUT_MS
setTimeout(() => {
if (this.isReady()) {
return
}
console.log('engine connection timeout on connection, retrying')
this.close()
this.connect()
}, connectionTimeoutMs)
}) })
this.pc.addEventListener('track', (event) => { this.pc.addEventListener('track', (event) => {
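The inline setTimeout retry removed above is replaced by the connect-retry interval added earlier in the constructor: a periodic timer keeps calling connect() until the connection is ready, stops rescheduling once the dead flag is set, and connect() itself bails out when a handshake is already in flight. A minimal sketch of that shape, with assumed names rather than the real EngineConnection members:

// Sketch of the retry pattern: not the real class, just the control flow.
class RetryingConnection {
  private ready = false
  private connecting = false
  private dead = false

  constructor(retryMs: number) {
    const timer = setInterval(() => {
      if (this.dead) {
        // tearDown() was called; stop retrying for good.
        clearInterval(timer)
        return
      }
      if (this.ready) return
      this.connect()
    }, retryMs)
  }

  connect() {
    // Guard against overlapping attempts, as connect() now does above.
    if (this.connecting || this.ready) return
    this.connecting = true
    // ...perform the actual handshake, then set this.ready = true...
  }

  tearDown() {
    this.dead = true
  }
}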
@ -461,6 +489,7 @@ export class EngineConnection {
this.onEngineConnectionOpen(this) this.onEngineConnectionOpen(this)
this.ready = true this.ready = true
this.connecting = false
}) })
this.unreliableDataChannel.addEventListener('close', (event) => { this.unreliableDataChannel.addEventListener('close', (event) => {
@ -474,6 +503,22 @@ export class EngineConnection {
}) })
}) })
const connectionTimeoutMs = VITE_KC_CONNECTION_TIMEOUT_MS
if (this.failedConnTimeout) {
console.log('clearing timeout before set')
clearTimeout(this.failedConnTimeout)
this.failedConnTimeout = null
}
console.log('timeout set')
this.failedConnTimeout = setTimeout(() => {
if (this.isReady()) {
return
}
console.log('engine connection timeout on connection, closing')
this.close()
}, connectionTimeoutMs)
this.onConnectionStarted(this) this.onConnectionStarted(this)
} }
unreliableSend(message: object | string) { unreliableSend(message: object | string) {
@ -498,9 +543,15 @@ export class EngineConnection {
this.pc = undefined this.pc = undefined
this.unreliableDataChannel = undefined this.unreliableDataChannel = undefined
this.webrtcStatsCollector = undefined this.webrtcStatsCollector = undefined
if (this.failedConnTimeout) {
console.log('closed timeout in close')
clearTimeout(this.failedConnTimeout)
this.failedConnTimeout = null
}
this.onClose(this) this.onClose(this)
this.ready = false this.ready = false
this.connecting = false
} }
} }
@ -544,7 +595,12 @@ export class EngineCommandManager {
[localUnsubscribeId: string]: (a: any) => void [localUnsubscribeId: string]: (a: any) => void
} }
} = {} as any } = {} as any
constructor({
constructor() {
this.engineConnection = undefined
}
start({
setMediaStream, setMediaStream,
setIsStreamReady, setIsStreamReady,
width, width,
@ -557,6 +613,16 @@ export class EngineCommandManager {
height: number height: number
token?: string token?: string
}) { }) {
if (width === 0 || height === 0) {
return
}
// If we already have an engine connection, just need to resize the stream.
if (this.engineConnection) {
this.handleResize({ streamWidth: width, streamHeight: height })
return
}
this.waitForReady = new Promise((resolve) => { this.waitForReady = new Promise((resolve) => {
this.resolveReady = resolve this.resolveReady = resolve
}) })
@ -617,6 +683,8 @@ export class EngineCommandManager {
message.request_id message.request_id
) { ) {
this.handleModelingCommand(message.resp, message.request_id) this.handleModelingCommand(message.resp, message.request_id)
} else if (!message.success && message.request_id) {
this.handleFailedModelingCommand(message)
} }
} }
}) })
@ -636,7 +704,35 @@ export class EngineCommandManager {
this.engineConnection?.connect() this.engineConnection?.connect()
} }
handleResize({
streamWidth,
streamHeight,
}: {
streamWidth: number
streamHeight: number
}) {
console.log('handleResize', streamWidth, streamHeight)
if (!this.engineConnection?.isReady()) {
return
}
const resizeCmd: EngineCommand = {
type: 'modeling_cmd_req',
cmd_id: uuidv4(),
cmd: {
type: 'reconfigure_stream',
width: streamWidth,
height: streamHeight,
fps: 60,
},
}
this.engineConnection?.send(resizeCmd)
}
handleModelingCommand(message: WebSocketResponse, id: string) { handleModelingCommand(message: WebSocketResponse, id: string) {
if (this.engineConnection === undefined) {
return
}
if (message.type !== 'modeling') { if (message.type !== 'modeling') {
return return
} }
@ -673,8 +769,40 @@ export class EngineCommandManager {
} }
} }
} }
handleFailedModelingCommand({
request_id,
errors,
}: Models['FailureWebSocketResponse_type']) {
const id = request_id
if (!id) return
const command = this.artifactMap[id]
if (command && command.type === 'pending') {
const resolve = command.resolve
this.artifactMap[id] = {
type: 'failed',
range: command.range,
commandType: command.commandType,
parentId: command.parentId ? command.parentId : undefined,
errors,
}
resolve({
id,
commandType: command.commandType,
range: command.range,
errors,
})
} else {
this.artifactMap[id] = {
type: 'failed',
range: command.range,
commandType: command.commandType,
parentId: command.parentId ? command.parentId : undefined,
errors,
}
}
}
tearDown() { tearDown() {
this.engineConnection?.close() this.engineConnection?.tearDown()
} }
startNewSession() { startNewSession() {
this.artifactMap = {} this.artifactMap = {}
@ -769,6 +897,9 @@ export class EngineCommandManager {
}) })
} }
sendSceneCommand(command: EngineCommand): Promise<any> { sendSceneCommand(command: EngineCommand): Promise<any> {
if (this.engineConnection === undefined) {
return Promise.resolve()
}
if ( if (
command.type === 'modeling_cmd_req' && command.type === 'modeling_cmd_req' &&
command.cmd.type !== lastMessage command.cmd.type !== lastMessage
@ -777,7 +908,6 @@ export class EngineCommandManager {
lastMessage = command.cmd.type lastMessage = command.cmd.type
} }
if (!this.engineConnection?.isReady()) { if (!this.engineConnection?.isReady()) {
console.log('socket not ready')
return Promise.resolve() return Promise.resolve()
} }
if (command.type !== 'modeling_cmd_req') return Promise.resolve() if (command.type !== 'modeling_cmd_req') return Promise.resolve()
@ -821,10 +951,12 @@ export class EngineCommandManager {
range: SourceRange range: SourceRange
command: EngineCommand | string command: EngineCommand | string
}): Promise<any> { }): Promise<any> {
if (this.engineConnection === undefined) {
return Promise.resolve()
}
this.sourceRangeMap[id] = range this.sourceRangeMap[id] = range
if (!this.engineConnection?.isReady()) { if (!this.engineConnection?.isReady()) {
console.log('socket not ready')
return Promise.resolve() return Promise.resolve()
} }
this.engineConnection?.send(command) this.engineConnection?.send(command)
@ -867,6 +999,9 @@ export class EngineCommandManager {
rangeStr: string, rangeStr: string,
commandStr: string commandStr: string
): Promise<any> { ): Promise<any> {
if (this.engineConnection === undefined) {
return Promise.resolve()
}
if (id === undefined) { if (id === undefined) {
throw new Error('id is undefined') throw new Error('id is undefined')
} }
@ -890,6 +1025,8 @@ export class EngineCommandManager {
} }
if (command.type === 'result') { if (command.type === 'result') {
return command.data return command.data
} else if (command.type === 'failed') {
return Promise.resolve(command.errors)
} }
return command.promise return command.promise
} }
@ -915,6 +1052,9 @@ export class EngineCommandManager {
} }
} }
private async fixIdMappings(ast: Program, programMemory: ProgramMemory) { private async fixIdMappings(ast: Program, programMemory: ProgramMemory) {
if (this.engineConnection === undefined) {
return
}
/* This is a temporary solution since the cmd_ids that are sent through when /* This is a temporary solution since the cmd_ids that are sent through when
sending 'extend_path' ids are not used as the segment ids. sending 'extend_path' ids are not used as the segment ids.
@ -994,3 +1134,5 @@ export class EngineCommandManager {
}) })
} }
} }
export const engineCommandManager = new EngineCommandManager()
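With the FailedCommand variant added to the ArtifactMap, consumers narrow entries on the type discriminant and can hand back a failed command's errors instead of waiting on a promise that will never deliver data, matching the result/failed/pending handling shown above. A minimal hand-rolled sketch, with a simplified error shape instead of the generated Models types:

// Sketch only: approximate artifact-map entry union and how a reader narrows it.
type Entry =
  | { type: 'result'; data: unknown }
  | { type: 'pending'; promise: Promise<unknown> }
  | { type: 'failed'; errors: { message: string }[] }

async function readEntry(entry: Entry): Promise<unknown> {
  if (entry.type === 'result') return entry.data
  if (entry.type === 'failed') return entry.errors // surface the failure details
  return entry.promise // still pending; let the caller await completion
}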

View File

@ -1279,7 +1279,7 @@ export function getTransformInfos(
}) as TransformInfo[] }) as TransformInfo[]
return theTransforms return theTransforms
} catch (error) { } catch (error) {
console.log(error) console.log('error', error)
return [] return []
} }
} }

View File

@ -11,7 +11,7 @@ export async function asyncLexer(str: string): Promise<Token[]> {
return tokens return tokens
} catch (e) { } catch (e) {
// TODO: do something real with the error. // TODO: do something real with the error.
console.log('lexer', e) console.log('lexer error', e)
throw e throw e
} }
} }
@ -22,7 +22,7 @@ export function lexer(str: string): Token[] {
return tokens return tokens
} catch (e) { } catch (e) {
// TODO: do something real with the error. // TODO: do something real with the error.
console.log('lexer', e) console.log('lexer error', e)
throw e throw e
} }
} }

View File

@ -39,6 +39,6 @@ export async function exportSave(data: ArrayBuffer) {
} }
} catch (e) { } catch (e) {
// TODO: do something real with the error. // TODO: do something real with the error.
console.log('export', e) console.log('export error', e)
} }
} }

View File

@ -36,7 +36,7 @@ export async function initializeProjectDirectory(directory: string) {
try { try {
docDirectory = await documentDir() docDirectory = await documentDir()
} catch (e) { } catch (e) {
console.log(e) console.log('error', e)
docDirectory = await homeDir() // seems to work better on Linux docDirectory = await homeDir() // seems to work better on Linux
} }

View File

@ -5,6 +5,9 @@ import {
EngineCommand, EngineCommand,
} from '../lang/std/engineConnection' } from '../lang/std/engineConnection'
import { SourceRange } from 'lang/executor' import { SourceRange } from 'lang/executor'
import { Models } from '@kittycad/lib'
type WebSocketResponse = Models['OkWebSocketResponseData_type']
class MockEngineCommandManager { class MockEngineCommandManager {
constructor(mockParams: { constructor(mockParams: {
@ -23,7 +26,13 @@ class MockEngineCommandManager {
range: SourceRange range: SourceRange
command: EngineCommand command: EngineCommand
}): Promise<any> { }): Promise<any> {
return Promise.resolve() const response: WebSocketResponse = {
type: 'modeling',
data: {
modeling_response: { type: 'empty' },
},
}
return Promise.resolve(JSON.stringify(response))
} }
sendModelingCommandFromWasm( sendModelingCommandFromWasm(
id: string, id: string,
@ -66,11 +75,12 @@ export async function executor(
ast: Program, ast: Program,
pm: ProgramMemory = { root: {}, return: null } pm: ProgramMemory = { root: {}, return: null }
): Promise<ProgramMemory> { ): Promise<ProgramMemory> {
const engineCommandManager = new EngineCommandManager({ const engineCommandManager = new EngineCommandManager()
engineCommandManager.start({
setIsStreamReady: () => {}, setIsStreamReady: () => {},
setMediaStream: () => {}, setMediaStream: () => {},
width: 100, width: 0,
height: 100, height: 0,
}) })
await engineCommandManager.waitForReady await engineCommandManager.waitForReady
engineCommandManager.startNewSession() engineCommandManager.startNewSession()
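The mock above now resolves with a stringified WebSocketResponse whose modeling_response is 'empty', so code under test receives something shaped like a real engine reply. A hedged sketch of how a test helper might unpack it (field names copied from the mock; nothing beyond that is assumed):

// Sketch only: parse the stringified mock response and pull out the
// modeling_response type ('empty' for the mock above).
function unpackMockResponse(raw: string): string {
  const parsed = JSON.parse(raw)
  if (parsed?.type === 'modeling') {
    return parsed.data.modeling_response.type
  }
  return 'unknown'
}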

View File

@ -19,6 +19,7 @@ import { KCLError } from './lang/errors'
import { deferExecution } from 'lib/utils' import { deferExecution } from 'lib/utils'
import { _executor } from './lang/executor' import { _executor } from './lang/executor'
import { bracket } from 'lib/exampleKcl' import { bracket } from 'lib/exampleKcl'
import { engineCommandManager } from './lang/std/engineConnection'
export type Selection = { export type Selection = {
type: 'default' | 'line-end' | 'line-mid' type: 'default' | 'line-end' | 'line-mid'
@ -162,8 +163,6 @@ export interface StoreState {
setProgramMemory: (programMemory: ProgramMemory) => void setProgramMemory: (programMemory: ProgramMemory) => void
isShiftDown: boolean isShiftDown: boolean
setIsShiftDown: (isShiftDown: boolean) => void setIsShiftDown: (isShiftDown: boolean) => void
engineCommandManager?: EngineCommandManager
setEngineCommandManager: (engineCommandManager: EngineCommandManager) => void
mediaStream?: MediaStream mediaStream?: MediaStream
setMediaStream: (mediaStream: MediaStream) => void setMediaStream: (mediaStream: MediaStream) => void
isStreamReady: boolean isStreamReady: boolean
@ -226,7 +225,7 @@ export const useStore = create<StoreState>()(
const result = await executeCode({ const result = await executeCode({
code: code || get().code, code: code || get().code,
lastAst: get().ast, lastAst: get().ast,
engineCommandManager: get().engineCommandManager, engineCommandManager: engineCommandManager,
}) })
if (!result.isChange) { if (!result.isChange) {
return return
@ -332,8 +331,6 @@ export const useStore = create<StoreState>()(
executeAst: async (ast) => { executeAst: async (ast) => {
const _ast = ast || get().ast const _ast = ast || get().ast
if (!get().isStreamReady) return if (!get().isStreamReady) return
const engineCommandManager = get().engineCommandManager!
if (!engineCommandManager) return
set({ isExecuting: true }) set({ isExecuting: true })
const { logs, errors, programMemory } = await executeAst({ const { logs, errors, programMemory } = await executeAst({
@ -350,8 +347,6 @@ export const useStore = create<StoreState>()(
executeAstMock: async (ast) => { executeAstMock: async (ast) => {
const _ast = ast || get().ast const _ast = ast || get().ast
if (!get().isStreamReady) return if (!get().isStreamReady) return
const engineCommandManager = get().engineCommandManager!
if (!engineCommandManager) return
const { logs, errors, programMemory } = await executeAst({ const { logs, errors, programMemory } = await executeAst({
ast: _ast, ast: _ast,
@ -435,8 +430,6 @@ export const useStore = create<StoreState>()(
setProgramMemory: (programMemory) => set({ programMemory }), setProgramMemory: (programMemory) => set({ programMemory }),
isShiftDown: false, isShiftDown: false,
setIsShiftDown: (isShiftDown) => set({ isShiftDown }), setIsShiftDown: (isShiftDown) => set({ isShiftDown }),
setEngineCommandManager: (engineCommandManager) =>
set({ engineCommandManager }),
setMediaStream: (mediaStream) => set({ mediaStream }), setMediaStream: (mediaStream) => set({ mediaStream }),
isStreamReady: false, isStreamReady: false,
setIsStreamReady: (isStreamReady) => set({ isStreamReady }), setIsStreamReady: (isStreamReady) => set({ isStreamReady }),
@ -454,7 +447,9 @@ export const useStore = create<StoreState>()(
fileId: '', fileId: '',
setFileId: (fileId) => set({ fileId }), setFileId: (fileId) => set({ fileId }),
streamDimensions: { streamWidth: 1280, streamHeight: 720 }, streamDimensions: { streamWidth: 1280, streamHeight: 720 },
setStreamDimensions: (streamDimensions) => set({ streamDimensions }), setStreamDimensions: (streamDimensions) => {
set({ streamDimensions })
},
isExecuting: false, isExecuting: false,
setIsExecuting: (isExecuting) => set({ isExecuting }), setIsExecuting: (isExecuting) => set({ isExecuting }),
@ -519,7 +514,7 @@ async function executeCode({
}: { }: {
code: string code: string
lastAst: Program lastAst: Program
engineCommandManager?: EngineCommandManager engineCommandManager: EngineCommandManager
}): Promise< }): Promise<
| { | {
logs: string[] logs: string[]
@ -539,7 +534,7 @@ async function executeCode({
if (e instanceof KCLError) { if (e instanceof KCLError) {
errors = [e] errors = [e]
logs = [] logs = []
if (e.msg === 'file is empty') engineCommandManager?.endSession() if (e.msg === 'file is empty') engineCommandManager.endSession()
} }
return { return {
isChange: true, isChange: true,
@ -562,7 +557,7 @@ async function executeCode({
} }
// Check if the ast we have is equal to the ast in the storage. // Check if the ast we have is equal to the ast in the storage.
// If it is, we don't need to update the ast. // If it is, we don't need to update the ast.
if (!engineCommandManager || JSON.stringify(ast) === JSON.stringify(lastAst)) if (JSON.stringify(ast) === JSON.stringify(lastAst))
return { isChange: false } return { isChange: false }
const { logs, errors, programMemory } = await executeAst({ const { logs, errors, programMemory } = await executeAst({
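The change above drops the engineCommandManager guard and keeps the structural-equality shortcut: serialize both ASTs and compare the strings. A tiny sketch of that check, which is cheap and reasonable here since both ASTs come from the same parser, keeping key order consistent:

// Sketch of the serialize-and-compare equality shortcut used above.
function sameAst(a: object, b: object): boolean {
  return JSON.stringify(a) === JSON.stringify(b)
}

// e.g. skip re-execution when nothing changed:
// if (sameAst(ast, lastAst)) return { isChange: false }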

src/wasm-lib/Cargo.lock (generated, 210 changed lines)
View File

@ -63,6 +63,12 @@ dependencies = [
"libc", "libc",
] ]
[[package]]
name = "anes"
version = "0.1.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4b46cbb362ab8752921c97e041f5e366ee6297bd428a31275b9fcf1e380f7299"
[[package]] [[package]]
name = "anstream" name = "anstream"
version = "0.5.0" version = "0.5.0"
@ -142,6 +148,17 @@ dependencies = [
"thiserror", "thiserror",
] ]
[[package]]
name = "async-recursion"
version = "1.0.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5fd55a5ba1179988837d24ab4c7cc8ed6efdeff578ede0416b4225a5fca35bd0"
dependencies = [
"proc-macro2",
"quote",
"syn 2.0.37",
]
[[package]] [[package]]
name = "async-trait" name = "async-trait"
version = "0.1.73" version = "0.1.73"
@ -211,13 +228,16 @@ checksum = "9ba43ea6f343b788c8764558649e08df62f86c6ef251fdaeb1ffd010a9ae50a2"
[[package]] [[package]]
name = "bigdecimal" name = "bigdecimal"
version = "0.3.1" version = "0.4.1"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a6773ddc0eafc0e509fb60e48dff7f450f8e674a0686ae8605e8d9901bd5eefa" checksum = "454bca3db10617b88b566f205ed190aedb0e0e6dd4cad61d3988a72e8c5594cb"
dependencies = [ dependencies = [
"autocfg",
"libm",
"num-bigint", "num-bigint",
"num-integer", "num-integer",
"num-traits", "num-traits",
"serde",
] ]
[[package]] [[package]]
@ -337,6 +357,12 @@ dependencies = [
"serde", "serde",
] ]
[[package]]
name = "cast"
version = "0.3.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "37b2a672a2cb129a2e41c10b1224bb368f9f37a2b16b612598138befd7b37eb5"
[[package]] [[package]]
name = "cc" name = "cc"
version = "1.0.83" version = "1.0.83"
@ -374,6 +400,33 @@ dependencies = [
"windows-targets 0.48.5", "windows-targets 0.48.5",
] ]
[[package]]
name = "ciborium"
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "effd91f6c78e5a4ace8a5d3c0b6bfaec9e2baaef55f3efc00e45fb2e477ee926"
dependencies = [
"ciborium-io",
"ciborium-ll",
"serde",
]
[[package]]
name = "ciborium-io"
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cdf919175532b369853f5d5e20b26b43112613fd6fe7aee757e35f7a44642656"
[[package]]
name = "ciborium-ll"
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "defaa24ecc093c77630e6c15e17c51f5e187bf35ee514f4e2d67baaa96dae22b"
dependencies = [
"ciborium-io",
"half 1.8.2",
]
[[package]] [[package]]
name = "clang-sys" name = "clang-sys"
version = "1.6.1" version = "1.6.1"
@ -507,6 +560,42 @@ dependencies = [
"cfg-if", "cfg-if",
] ]
[[package]]
name = "criterion"
version = "0.5.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f2b12d017a929603d80db1831cd3a24082f8137ce19c69e6447f54f5fc8d692f"
dependencies = [
"anes",
"cast",
"ciborium",
"clap",
"criterion-plot",
"is-terminal",
"itertools 0.10.5",
"num-traits",
"once_cell",
"oorandom",
"plotters",
"rayon",
"regex",
"serde",
"serde_derive",
"serde_json",
"tinytemplate",
"walkdir",
]
[[package]]
name = "criterion-plot"
version = "0.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6b50826342786a51a89e2da3a28f1c32b06e387201bc2d19791f622c673706b1"
dependencies = [
"cast",
"itertools 0.10.5",
]
[[package]] [[package]]
name = "crossbeam-channel" name = "crossbeam-channel"
version = "0.5.8" version = "0.5.8"
@ -716,7 +805,7 @@ checksum = "832a761f35ab3e6664babfbdc6cef35a4860e816ec3916dcfd0882954e98a8a8"
dependencies = [ dependencies = [
"bit_field", "bit_field",
"flume", "flume",
"half", "half 2.2.1",
"lebe", "lebe",
"miniz_oxide", "miniz_oxide",
"rayon-core", "rayon-core",
@ -983,6 +1072,12 @@ dependencies = [
"tracing", "tracing",
] ]
[[package]]
name = "half"
version = "1.8.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "eabb4a44450da02c90444cf74558da904edde8fb4e9035a9a6a4e15445af0bd7"
[[package]] [[package]]
name = "half" name = "half"
version = "2.2.1" version = "2.2.1"
@ -1281,12 +1376,14 @@ dependencies = [
[[package]] [[package]]
name = "kcl-lib" name = "kcl-lib"
version = "0.1.30" version = "0.1.31"
dependencies = [ dependencies = [
"anyhow", "anyhow",
"async-recursion",
"async-trait", "async-trait",
"bson", "bson",
"clap", "clap",
"criterion",
"dashmap", "dashmap",
"derive-docs", "derive-docs",
"expectorate", "expectorate",
@ -1297,7 +1394,6 @@ dependencies = [
"lazy_static", "lazy_static",
"parse-display", "parse-display",
"pretty_assertions", "pretty_assertions",
"regex",
"reqwest", "reqwest",
"schemars", "schemars",
"serde", "serde",
@ -1311,17 +1407,19 @@ dependencies = [
"wasm-bindgen", "wasm-bindgen",
"wasm-bindgen-futures", "wasm-bindgen-futures",
"web-sys", "web-sys",
"winnow",
] ]
[[package]] [[package]]
name = "kittycad" name = "kittycad"
version = "0.2.25" version = "0.2.26"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d9cf962b1e81a0b4eb923a727e761b40672cbacc7f5f0b75e13579d346352bc7" checksum = "e2623ee601ce203476229df3f9d3a14664cb43e3f7455e9ac8ed91aacaa6163d"
dependencies = [ dependencies = [
"anyhow", "anyhow",
"async-trait", "async-trait",
"base64 0.21.4", "base64 0.21.4",
"bigdecimal",
"bytes", "bytes",
"chrono", "chrono",
"data-encoding", "data-encoding",
@ -1383,6 +1481,12 @@ dependencies = [
"winapi", "winapi",
] ]
[[package]]
name = "libm"
version = "0.2.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f7012b1bbb0719e1097c47611d3898568c546d597c2e74d66f6087edd5233ff4"
[[package]] [[package]]
name = "linked-hash-map" name = "linked-hash-map"
version = "0.5.6" version = "0.5.6"
@ -1606,10 +1710,16 @@ version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "44d11de466f4a3006fe8a5e7ec84e93b79c70cb992ae0aa0eb631ad2df8abfe2" checksum = "44d11de466f4a3006fe8a5e7ec84e93b79c70cb992ae0aa0eb631ad2df8abfe2"
[[package]]
name = "oorandom"
version = "11.1.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0ab1bc2a289d34bd04a330323ac98a1b4bc82c9d9fcb1e66b63caa84da26b575"
[[package]] [[package]]
name = "openapitor" name = "openapitor"
version = "0.0.9" version = "0.0.9"
source = "git+https://github.com/KittyCAD/kittycad.rs?branch=main#0d121f6881da91b4a30bee18bbfe50e4a2096073" source = "git+https://github.com/KittyCAD/kittycad.rs?branch=main#61a16059b3eaf8793a2a2e1edbc0d770f284fea3"
dependencies = [ dependencies = [
"Inflector", "Inflector",
"anyhow", "anyhow",
@ -1783,14 +1893,14 @@ dependencies = [
[[package]] [[package]]
name = "phonenumber" name = "phonenumber"
version = "0.3.2+8.13.9" version = "0.3.3+8.13.9"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "34749f64ea9d76f10cdc8a859588b57775f59177c7dd91f744d620bd62982d6f" checksum = "635f3e6288e4f01c049d89332a031bd74f25d64b6fb94703ca966e819488cd06"
dependencies = [ dependencies = [
"bincode", "bincode",
"either", "either",
"fnv", "fnv",
"itertools 0.10.5", "itertools 0.11.0",
"lazy_static", "lazy_static",
"nom", "nom",
"quick-xml", "quick-xml",
@ -1798,6 +1908,7 @@ dependencies = [
"regex-cache", "regex-cache",
"serde", "serde",
"serde_derive", "serde_derive",
"strum",
"thiserror", "thiserror",
] ]
@ -1839,6 +1950,34 @@ version = "0.3.27"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "26072860ba924cbfa98ea39c8c19b4dd6a4a25423dbdf219c1eca91aa0cf6964" checksum = "26072860ba924cbfa98ea39c8c19b4dd6a4a25423dbdf219c1eca91aa0cf6964"
[[package]]
name = "plotters"
version = "0.3.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d2c224ba00d7cadd4d5c660deaf2098e5e80e07846537c51f9cfa4be50c1fd45"
dependencies = [
"num-traits",
"plotters-backend",
"plotters-svg",
"wasm-bindgen",
"web-sys",
]
[[package]]
name = "plotters-backend"
version = "0.3.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9e76628b4d3a7581389a35d5b6e2139607ad7c75b17aed325f210aa91f4a9609"
[[package]]
name = "plotters-svg"
version = "0.3.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "38f6d39893cca0701371e3c27294f09797214b86f1fb951b89ade8ec04e2abab"
dependencies = [
"plotters-backend",
]
[[package]] [[package]]
name = "png" name = "png"
version = "0.17.10" version = "0.17.10"
@ -2696,6 +2835,28 @@ dependencies = [
"syn 2.0.37", "syn 2.0.37",
] ]
[[package]]
name = "strum"
version = "0.24.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "063e6045c0e62079840579a7e47a355ae92f60eb74daaf156fb1e84ba164e63f"
dependencies = [
"strum_macros",
]
[[package]]
name = "strum_macros"
version = "0.24.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1e385be0d24f186b4ce2f9982191e7101bb737312ad61c1f2f984f34bcf85d59"
dependencies = [
"heck",
"proc-macro2",
"quote",
"rustversion",
"syn 1.0.109",
]
[[package]] [[package]]
name = "syn" name = "syn"
version = "1.0.109" version = "1.0.109"
@ -2853,6 +3014,16 @@ dependencies = [
"time-core", "time-core",
] ]
[[package]]
name = "tinytemplate"
version = "1.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "be4d6b5f19ff7664e8c98d03e2139cb510db9b0a60b55f8e8709b689d939b6bc"
dependencies = [
"serde",
"serde_json",
]
[[package]] [[package]]
name = "tinyvec" name = "tinyvec"
version = "1.6.0" version = "1.6.0"
@ -2910,9 +3081,9 @@ dependencies = [
[[package]] [[package]]
name = "tokio-tungstenite" name = "tokio-tungstenite"
version = "0.20.0" version = "0.20.1"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2b2dbec703c26b00d74844519606ef15d09a7d6857860f84ad223dec002ddea2" checksum = "212d5dcb2a1ce06d81107c3d0ffa3121fe974b73f068c8282cb1c32328113b6c"
dependencies = [ dependencies = [
"futures-util", "futures-util",
"log", "log",
@ -3132,9 +3303,9 @@ dependencies = [
[[package]] [[package]]
name = "tungstenite" name = "tungstenite"
version = "0.20.0" version = "0.20.1"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e862a1c4128df0112ab625f55cd5c934bcb4312ba80b39ae4b4835a3fd58e649" checksum = "9e3dac10fd62eaf6617d3a904ae222845979aec67c615d1c842b4002c7666fb9"
dependencies = [ dependencies = [
"byteorder", "byteorder",
"bytes", "bytes",
@ -3621,6 +3792,15 @@ version = "0.48.5"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ed94fce61571a4006852b7389a063ab983c02eb1bb37b47f8272ce92d06d9538" checksum = "ed94fce61571a4006852b7389a063ab983c02eb1bb37b47f8272ce92d06d9538"
[[package]]
name = "winnow"
version = "0.5.15"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7c2e3184b9c4e92ad5167ca73039d0c42476302ab603e2fec4487511f38ccefc"
dependencies = [
"memchr",
]
[[package]] [[package]]
name = "winreg" name = "winreg"
version = "0.50.0" version = "0.50.0"

View File

@ -79,13 +79,6 @@ fn do_stdlib_inner(
)); ));
} }
if ast.sig.asyncness.is_some() {
errors.push(Error::new_spanned(
&ast.sig.fn_token,
"stdlib functions must not be async",
));
}
if ast.sig.unsafety.is_some() { if ast.sig.unsafety.is_some() {
errors.push(Error::new_spanned( errors.push(Error::new_spanned(
&ast.sig.unsafety, &ast.sig.unsafety,
@ -118,6 +111,7 @@ fn do_stdlib_inner(
let fn_name = &ast.sig.ident; let fn_name = &ast.sig.ident;
let fn_name_str = fn_name.to_string().replace("inner_", ""); let fn_name_str = fn_name.to_string().replace("inner_", "");
let fn_name_ident = format_ident!("{}", fn_name_str); let fn_name_ident = format_ident!("{}", fn_name_str);
let boxed_fn_name_ident = format_ident!("boxed_{}", fn_name_str);
let _visibility = &ast.vis; let _visibility = &ast.vis;
let (summary_text, description_text) = extract_doc_from_attrs(&ast.attrs); let (summary_text, description_text) = extract_doc_from_attrs(&ast.attrs);
@ -204,7 +198,10 @@ fn do_stdlib_inner(
syn::FnArg::Typed(pat) => pat.ty.as_ref().into_token_stream(), syn::FnArg::Typed(pat) => pat.ty.as_ref().into_token_stream(),
}; };
let ty_string = ty.to_string().replace('&', "").replace("mut", "").replace(' ', ""); let mut ty_string = ty.to_string().replace('&', "").replace("mut", "").replace(' ', "");
if ty_string.starts_with("Args") {
ty_string = "Args".to_string();
}
let ty_string = ty_string.trim().to_string(); let ty_string = ty_string.trim().to_string();
let ty_ident = if ty_string.starts_with("Vec<") { let ty_ident = if ty_string.starts_with("Vec<") {
let ty_string = ty_string.trim_start_matches("Vec<").trim_end_matches('>'); let ty_string = ty_string.trim_start_matches("Vec<").trim_end_matches('>');
@ -305,6 +302,14 @@ fn do_stdlib_inner(
#description_doc_comment #description_doc_comment
#const_struct #const_struct
fn #boxed_fn_name_ident(
args: crate::std::Args,
) -> std::pin::Pin<
Box<dyn std::future::Future<Output = anyhow::Result<crate::executor::MemoryItem, crate::errors::KclError>>>,
> {
Box::pin(#fn_name_ident(args))
}
impl #docs_crate::StdLibFn for #name_ident impl #docs_crate::StdLibFn for #name_ident
{ {
fn name(&self) -> String { fn name(&self) -> String {
@ -348,7 +353,7 @@ fn do_stdlib_inner(
} }
fn std_lib_fn(&self) -> crate::std::StdFn { fn std_lib_fn(&self) -> crate::std::StdFn {
#fn_name_ident #boxed_fn_name_ident
} }
fn clone_box(&self) -> Box<dyn #docs_crate::StdLibFn> { fn clone_box(&self) -> Box<dyn #docs_crate::StdLibFn> {

View File

@ -7,6 +7,18 @@ pub(crate) struct Show {}
#[allow(non_upper_case_globals, missing_docs)] #[allow(non_upper_case_globals, missing_docs)]
#[doc = "Std lib function: show"] #[doc = "Std lib function: show"]
pub(crate) const Show: Show = Show {}; pub(crate) const Show: Show = Show {};
fn boxed_show(
args: crate::std::Args,
) -> std::pin::Pin<
Box<
dyn std::future::Future<
Output = anyhow::Result<crate::executor::MemoryItem, crate::errors::KclError>,
>,
>,
> {
Box::pin(show(args))
}
impl crate::docs::StdLibFn for Show { impl crate::docs::StdLibFn for Show {
fn name(&self) -> String { fn name(&self) -> String {
"show".to_string() "show".to_string()
@ -57,7 +69,7 @@ impl crate::docs::StdLibFn for Show {
} }
fn std_lib_fn(&self) -> crate::std::StdFn { fn std_lib_fn(&self) -> crate::std::StdFn {
show boxed_show
} }
fn clone_box(&self) -> Box<dyn crate::docs::StdLibFn> { fn clone_box(&self) -> Box<dyn crate::docs::StdLibFn> {

View File

@ -7,6 +7,18 @@ pub(crate) struct LineTo {}
#[allow(non_upper_case_globals, missing_docs)] #[allow(non_upper_case_globals, missing_docs)]
#[doc = "Std lib function: lineTo"] #[doc = "Std lib function: lineTo"]
pub(crate) const LineTo: LineTo = LineTo {}; pub(crate) const LineTo: LineTo = LineTo {};
fn boxed_line_to(
args: crate::std::Args,
) -> std::pin::Pin<
Box<
dyn std::future::Future<
Output = anyhow::Result<crate::executor::MemoryItem, crate::errors::KclError>,
>,
>,
> {
Box::pin(line_to(args))
}
impl crate::docs::StdLibFn for LineTo { impl crate::docs::StdLibFn for LineTo {
fn name(&self) -> String { fn name(&self) -> String {
"lineTo".to_string() "lineTo".to_string()
@ -65,7 +77,7 @@ impl crate::docs::StdLibFn for LineTo {
} }
fn std_lib_fn(&self) -> crate::std::StdFn { fn std_lib_fn(&self) -> crate::std::StdFn {
line_to boxed_line_to
} }
fn clone_box(&self) -> Box<dyn crate::docs::StdLibFn> { fn clone_box(&self) -> Box<dyn crate::docs::StdLibFn> {

View File

@ -7,6 +7,18 @@ pub(crate) struct Min {}
#[allow(non_upper_case_globals, missing_docs)] #[allow(non_upper_case_globals, missing_docs)]
#[doc = "Std lib function: min"] #[doc = "Std lib function: min"]
pub(crate) const Min: Min = Min {}; pub(crate) const Min: Min = Min {};
fn boxed_min(
args: crate::std::Args,
) -> std::pin::Pin<
Box<
dyn std::future::Future<
Output = anyhow::Result<crate::executor::MemoryItem, crate::errors::KclError>,
>,
>,
> {
Box::pin(min(args))
}
impl crate::docs::StdLibFn for Min { impl crate::docs::StdLibFn for Min {
fn name(&self) -> String { fn name(&self) -> String {
"min".to_string() "min".to_string()
@ -57,7 +69,7 @@ impl crate::docs::StdLibFn for Min {
} }
fn std_lib_fn(&self) -> crate::std::StdFn { fn std_lib_fn(&self) -> crate::std::StdFn {
min boxed_min
} }
fn clone_box(&self) -> Box<dyn crate::docs::StdLibFn> { fn clone_box(&self) -> Box<dyn crate::docs::StdLibFn> {

View File

@ -7,6 +7,18 @@ pub(crate) struct Show {}
#[allow(non_upper_case_globals, missing_docs)] #[allow(non_upper_case_globals, missing_docs)]
#[doc = "Std lib function: show"] #[doc = "Std lib function: show"]
pub(crate) const Show: Show = Show {}; pub(crate) const Show: Show = Show {};
fn boxed_show(
args: crate::std::Args,
) -> std::pin::Pin<
Box<
dyn std::future::Future<
Output = anyhow::Result<crate::executor::MemoryItem, crate::errors::KclError>,
>,
>,
> {
Box::pin(show(args))
}
impl crate::docs::StdLibFn for Show { impl crate::docs::StdLibFn for Show {
fn name(&self) -> String { fn name(&self) -> String {
"show".to_string() "show".to_string()
@ -52,7 +64,7 @@ impl crate::docs::StdLibFn for Show {
} }
fn std_lib_fn(&self) -> crate::std::StdFn { fn std_lib_fn(&self) -> crate::std::StdFn {
show boxed_show
} }
fn clone_box(&self) -> Box<dyn crate::docs::StdLibFn> { fn clone_box(&self) -> Box<dyn crate::docs::StdLibFn> {

View File

@ -1,7 +1,7 @@
[package] [package]
name = "kcl-lib" name = "kcl-lib"
description = "KittyCAD Language" description = "KittyCAD Language"
version = "0.1.30" version = "0.1.31"
edition = "2021" edition = "2021"
license = "MIT" license = "MIT"
@ -9,6 +9,7 @@ license = "MIT"
[dependencies] [dependencies]
anyhow = { version = "1.0.75", features = ["backtrace"] } anyhow = { version = "1.0.75", features = ["backtrace"] }
async-recursion = "1.0.5"
async-trait = "0.1.73" async-trait = "0.1.73"
clap = { version = "4.4.3", features = ["cargo", "derive", "env", "unicode"], optional = true } clap = { version = "4.4.3", features = ["cargo", "derive", "env", "unicode"], optional = true }
dashmap = "5.5.3" dashmap = "5.5.3"
@ -17,13 +18,13 @@ derive-docs = { path = "../derive-docs" }
kittycad = { version = "0.2.25", default-features = false, features = ["js"] } kittycad = { version = "0.2.25", default-features = false, features = ["js"] }
lazy_static = "1.4.0" lazy_static = "1.4.0"
parse-display = "0.8.2" parse-display = "0.8.2"
regex = "1.7.1"
schemars = { version = "0.8", features = ["impl_json_schema", "url", "uuid1"] } schemars = { version = "0.8", features = ["impl_json_schema", "url", "uuid1"] }
serde = { version = "1.0.188", features = ["derive"] } serde = { version = "1.0.188", features = ["derive"] }
serde_json = "1.0.107" serde_json = "1.0.107"
thiserror = "1.0.48" thiserror = "1.0.48"
ts-rs = { version = "7", package = "ts-rs-json-value", features = ["serde-json-impl", "schemars-impl", "uuid-impl"] } ts-rs = { version = "7", package = "ts-rs-json-value", features = ["serde-json-impl", "schemars-impl", "uuid-impl"] }
uuid = { version = "1.4.1", features = ["v4", "js", "serde"] } uuid = { version = "1.4.1", features = ["v4", "js", "serde"] }
winnow = "0.5.15"
[target.'cfg(target_arch = "wasm32")'.dependencies] [target.'cfg(target_arch = "wasm32")'.dependencies]
js-sys = { version = "0.3.64" } js-sys = { version = "0.3.64" }
@ -50,7 +51,12 @@ panic = "abort"
debug = true debug = true
[dev-dependencies] [dev-dependencies]
criterion = "0.5.1"
expectorate = "1.0.7" expectorate = "1.0.7"
itertools = "0.11.0" itertools = "0.11.0"
pretty_assertions = "1.4.0" pretty_assertions = "1.4.0"
tokio = { version = "1.32.0", features = ["rt-multi-thread", "macros", "time"] } tokio = { version = "1.32.0", features = ["rt-multi-thread", "macros", "time"] }
[[bench]]
name = "compiler_benchmark"
harness = false

View File

@ -0,0 +1,41 @@
use criterion::{black_box, criterion_group, criterion_main, Criterion};
pub fn bench_lex(c: &mut Criterion) {
c.bench_function("lex_cube", |b| b.iter(|| lex(CUBE_PROGRAM)));
c.bench_function("lex_big_kitt", |b| b.iter(|| lex(KITT_PROGRAM)));
c.bench_function("lex_pipes_on_pipes", |b| b.iter(|| lex(PIPES_PROGRAM)));
}
pub fn bench_lex_parse(c: &mut Criterion) {
c.bench_function("parse_lex_cube", |b| b.iter(|| lex_and_parse(CUBE_PROGRAM)));
c.bench_function("parse_lex_big_kitt", |b| b.iter(|| lex_and_parse(KITT_PROGRAM)));
c.bench_function("parse_lex_pipes_on_pipes", |b| b.iter(|| lex_and_parse(PIPES_PROGRAM)));
}
fn lex(program: &str) {
black_box(kcl_lib::token::lexer(program));
}
fn lex_and_parse(program: &str) {
let tokens = kcl_lib::token::lexer(program);
let parser = kcl_lib::parser::Parser::new(tokens);
black_box(parser.ast().unwrap());
}
criterion_group!(benches, bench_lex, bench_lex_parse);
criterion_main!(benches);
const KITT_PROGRAM: &str = include_str!("../../tests/executor/inputs/kittycad_svg.kcl");
const PIPES_PROGRAM: &str = include_str!("../../tests/executor/inputs/pipes_on_pipes.kcl");
const CUBE_PROGRAM: &str = r#"fn cube = (pos, scale) => {
const sg = startSketchAt(pos)
|> line([0, scale], %)
|> line([scale, 0], %)
|> line([0, -scale], %)
return sg
}
const b1 = cube([0,0], 10)
const pt1 = b1[0]
show(b1)"#;

View File

@ -0,0 +1 @@
enum-variant-size-threshold = 24

View File

@ -44,54 +44,6 @@ dependencies = [
"memchr", "memchr",
] ]
[[package]]
name = "anstream"
version = "0.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b1f58811cfac344940f1a400b6e6231ce35171f614f26439e80f8c1465c5cc0c"
dependencies = [
"anstyle",
"anstyle-parse",
"anstyle-query",
"anstyle-wincon",
"colorchoice",
"utf8parse",
]
[[package]]
name = "anstyle"
version = "1.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "15c4c2c83f81532e5845a733998b6971faca23490340a418e9b72a3ec9de12ea"
[[package]]
name = "anstyle-parse"
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "938874ff5980b03a87c5524b3ae5b59cf99b1d6bc836848df7bc5ada9643c333"
dependencies = [
"utf8parse",
]
[[package]]
name = "anstyle-query"
version = "1.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5ca11d4be1bab0c8bc8734a9aa7bf4ee8316d462a08c6ac5052f888fef5b494b"
dependencies = [
"windows-sys",
]
[[package]]
name = "anstyle-wincon"
version = "2.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "58f54d10c6dfa51283a066ceab3ec1ab78d13fae00aa49243a45e4571fb79dfd"
dependencies = [
"anstyle",
"windows-sys",
]
[[package]] [[package]]
name = "anyhow" name = "anyhow"
version = "1.0.75" version = "1.0.75"
@ -123,6 +75,17 @@ dependencies = [
"thiserror", "thiserror",
] ]
[[package]]
name = "async-recursion"
version = "1.0.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5fd55a5ba1179988837d24ab4c7cc8ed6efdeff578ede0416b4225a5fca35bd0"
dependencies = [
"proc-macro2",
"quote",
"syn 2.0.37",
]
[[package]] [[package]]
name = "async-trait" name = "async-trait"
version = "0.1.73" version = "0.1.73"
@ -131,7 +94,7 @@ checksum = "bc00ceb34980c03614e35a3a4e218276a0a824e911d07651cd0d858a51e8c0f0"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.31", "syn 2.0.37",
] ]
[[package]] [[package]]
@ -179,6 +142,20 @@ version = "0.21.3"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "414dcefbc63d77c526a76b3afcf6fbb9b5e2791c19c3aa2297733208750c6e53" checksum = "414dcefbc63d77c526a76b3afcf6fbb9b5e2791c19c3aa2297733208750c6e53"
[[package]]
name = "bigdecimal"
version = "0.4.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "454bca3db10617b88b566f205ed190aedb0e0e6dd4cad61d3988a72e8c5594cb"
dependencies = [
"autocfg",
"libm",
"num-bigint",
"num-integer",
"num-traits",
"serde",
]
[[package]] [[package]]
name = "bincode" name = "bincode"
version = "1.3.3" version = "1.3.3"
@ -284,54 +261,6 @@ dependencies = [
"serde", "serde",
] ]
[[package]]
name = "clap"
version = "4.4.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6a13b88d2c62ff462f88e4a121f17a82c1af05693a2f192b5c38d14de73c19f6"
dependencies = [
"clap_builder",
"clap_derive",
]
[[package]]
name = "clap_builder"
version = "4.4.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2bb9faaa7c2ef94b2743a21f5a29e6f0010dff4caa69ac8e9d6cf8b6fa74da08"
dependencies = [
"anstream",
"anstyle",
"clap_lex",
"strsim",
"unicase",
"unicode-width",
]
[[package]]
name = "clap_derive"
version = "4.4.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0862016ff20d69b84ef8247369fabf5c008a7417002411897d40ee1f4532b873"
dependencies = [
"heck",
"proc-macro2",
"quote",
"syn 2.0.31",
]
[[package]]
name = "clap_lex"
version = "0.5.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cd7cc57abe963c6d3b9d8be5b06ba7c8957a930305ca90304f24ef040aa6f961"
[[package]]
name = "colorchoice"
version = "1.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "acbf1af155f9b9ef647e42cdc158db4b64a1b61f743629225fde6f3e0be2a7c7"
[[package]] [[package]]
name = "convert_case" name = "convert_case"
version = "0.6.0" version = "0.6.0"
@ -403,16 +332,14 @@ checksum = "f2696e8a945f658fd14dc3b87242e6b80cd0f36ff04ea560fa39082368847946"
[[package]] [[package]]
name = "derive-docs" name = "derive-docs"
version = "0.1.3" version = "0.1.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5fe5c5ea065cfabc5a7c5e8ed616e369fbf108c4be01e0e5609bc9846a732664"
dependencies = [ dependencies = [
"convert_case", "convert_case",
"proc-macro2", "proc-macro2",
"quote", "quote",
"serde", "serde",
"serde_tokenstream", "serde_tokenstream",
"syn 2.0.31", "syn 2.0.37",
] ]
[[package]] [[package]]
@ -529,7 +456,7 @@ checksum = "89ca545a94061b6365f2c7355b4b32bd20df3ff95f02da9329b34ccc3bd6ee72"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.31", "syn 2.0.37",
] ]
[[package]] [[package]]
@ -769,11 +696,12 @@ dependencies = [
[[package]] [[package]]
name = "kcl-lib" name = "kcl-lib"
version = "0.1.24" version = "0.1.31"
dependencies = [ dependencies = [
"anyhow", "anyhow",
"async-recursion",
"async-trait",
"bson", "bson",
"clap",
"dashmap", "dashmap",
"derive-docs", "derive-docs",
"futures", "futures",
@ -781,7 +709,6 @@ dependencies = [
"kittycad", "kittycad",
"lazy_static", "lazy_static",
"parse-display", "parse-display",
"regex",
"reqwest", "reqwest",
"schemars", "schemars",
"serde", "serde",
@ -794,6 +721,8 @@ dependencies = [
"uuid", "uuid",
"wasm-bindgen", "wasm-bindgen",
"wasm-bindgen-futures", "wasm-bindgen-futures",
"web-sys",
"winnow",
] ]
[[package]] [[package]]
@ -806,12 +735,13 @@ dependencies = [
[[package]] [[package]]
name = "kittycad" name = "kittycad"
version = "0.2.24" version = "0.2.26"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fd2f78e95054c83ab77059fe0237b128e2b241b4b8e9466e452d701c2599f3d3" checksum = "e2623ee601ce203476229df3f9d3a14664cb43e3f7455e9ac8ed91aacaa6163d"
dependencies = [ dependencies = [
"anyhow", "anyhow",
"base64 0.21.3", "base64 0.21.3",
"bigdecimal",
"bytes", "bytes",
"chrono", "chrono",
"data-encoding", "data-encoding",
@ -850,6 +780,12 @@ dependencies = [
"once_cell", "once_cell",
] ]
[[package]]
name = "libm"
version = "0.2.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f7012b1bbb0719e1097c47611d3898568c546d597c2e74d66f6087edd5233ff4"
[[package]] [[package]]
name = "linked-hash-map" name = "linked-hash-map"
version = "0.5.6" version = "0.5.6"
@ -942,6 +878,27 @@ dependencies = [
"minimal-lexical", "minimal-lexical",
] ]
[[package]]
name = "num-bigint"
version = "0.4.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "608e7659b5c3d7cba262d894801b9ec9d00de989e8a82bd4bef91d08da45cdc0"
dependencies = [
"autocfg",
"num-integer",
"num-traits",
]
[[package]]
name = "num-integer"
version = "0.1.45"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "225d3389fb3509a24c93f5c29eb6bde2586b98d9f016636dff58d7c6f7569cd9"
dependencies = [
"autocfg",
"num-traits",
]
[[package]] [[package]]
name = "num-traits" name = "num-traits"
version = "0.2.16" version = "0.2.16"
@ -1034,7 +991,7 @@ dependencies = [
"regex", "regex",
"regex-syntax 0.7.5", "regex-syntax 0.7.5",
"structmeta", "structmeta",
"syn 2.0.31", "syn 2.0.37",
] ]
[[package]] [[package]]
@ -1045,9 +1002,9 @@ checksum = "9b2a4787296e9989611394c33f193f676704af1686e70b8f8033ab5ba9a35a94"
[[package]] [[package]]
name = "phonenumber" name = "phonenumber"
version = "0.3.2+8.13.9" version = "0.3.3+8.13.9"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "34749f64ea9d76f10cdc8a859588b57775f59177c7dd91f744d620bd62982d6f" checksum = "635f3e6288e4f01c049d89332a031bd74f25d64b6fb94703ca966e819488cd06"
dependencies = [ dependencies = [
"bincode", "bincode",
"either", "either",
@ -1060,6 +1017,7 @@ dependencies = [
"regex-cache", "regex-cache",
"serde", "serde",
"serde_derive", "serde_derive",
"strum",
"thiserror", "thiserror",
] ]
@ -1080,7 +1038,7 @@ checksum = "4359fd9c9171ec6e8c62926d6faaf553a8dc3f64e1507e76da7911b4f6a04405"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.31", "syn 2.0.37",
] ]
[[package]] [[package]]
@ -1127,9 +1085,9 @@ dependencies = [
[[package]] [[package]]
name = "proc-macro2" name = "proc-macro2"
version = "1.0.66" version = "1.0.67"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "18fb31db3f9bddb2ea821cde30a9f70117e3f119938b5ee630b7403aa6e2ead9" checksum = "3d433d9f1a3e8c1263d9456598b16fec66f4acc9a74dacffd35c7bb09b3a1328"
dependencies = [ dependencies = [
"unicode-ident", "unicode-ident",
] ]
@ -1342,6 +1300,12 @@ dependencies = [
"untrusted", "untrusted",
] ]
[[package]]
name = "rustversion"
version = "1.0.14"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7ffc183a10b4478d04cbbbfc96d0873219d962dd5accaff2ffbd4ceb7df837f4"
[[package]] [[package]]
name = "ryu" name = "ryu"
version = "1.0.15" version = "1.0.15"
@ -1359,10 +1323,11 @@ dependencies = [
[[package]] [[package]]
name = "schemars" name = "schemars"
version = "0.8.13" version = "0.8.15"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "763f8cd0d4c71ed8389c90cb8100cba87e763bd01a8e614d4f0af97bcd50a161" checksum = "1f7b0ce13155372a76ee2e1c5ffba1fe61ede73fbea5630d61eee6fac4929c0c"
dependencies = [ dependencies = [
"bigdecimal",
"bytes", "bytes",
"chrono", "chrono",
"dyn-clone", "dyn-clone",
@ -1375,9 +1340,9 @@ dependencies = [
[[package]] [[package]]
name = "schemars_derive" name = "schemars_derive"
version = "0.8.13" version = "0.8.15"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ec0f696e21e10fa546b7ffb1c9672c6de8fbc7a81acf59524386d8639bf12737" checksum = "e85e2a16b12bdb763244c69ab79363d71db2b4b918a2def53f80b02e0574b13c"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
@ -1450,7 +1415,7 @@ checksum = "4eca7ac642d82aa35b60049a6eccb4be6be75e599bd2e9adb5f875a737654af2"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.31", "syn 2.0.37",
] ]
[[package]] [[package]]
@ -1466,9 +1431,9 @@ dependencies = [
[[package]] [[package]]
name = "serde_json" name = "serde_json"
version = "1.0.105" version = "1.0.107"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "693151e1ac27563d6dbcec9dee9fbd5da8539b20fa14ad3752b2e6d363ace360" checksum = "6b420ce6e3d8bd882e9b243c6eed35dbc9a6110c9769e74b584e0d68d1f20c65"
dependencies = [ dependencies = [
"indexmap 2.0.0", "indexmap 2.0.0",
"itoa", "itoa",
@ -1484,7 +1449,7 @@ checksum = "8725e1dfadb3a50f7e5ce0b1a540466f6ed3fe7a0fca2ac2b8b831d31316bd00"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.31", "syn 2.0.37",
] ]
[[package]] [[package]]
@ -1496,7 +1461,7 @@ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"serde", "serde",
"syn 2.0.31", "syn 2.0.37",
] ]
[[package]] [[package]]
@ -1572,12 +1537,6 @@ version = "0.5.2"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6e63cff320ae2c57904679ba7cb63280a3dc4613885beafb148ee7bf9aa9042d" checksum = "6e63cff320ae2c57904679ba7cb63280a3dc4613885beafb148ee7bf9aa9042d"
[[package]]
name = "strsim"
version = "0.10.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "73473c0e59e6d5812c5dfe2a064a6444949f089e20eec9a2e5506596494e4623"
[[package]] [[package]]
name = "structmeta" name = "structmeta"
version = "0.2.0" version = "0.2.0"
@ -1587,7 +1546,7 @@ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"structmeta-derive", "structmeta-derive",
"syn 2.0.31", "syn 2.0.37",
] ]
[[package]] [[package]]
@ -1598,7 +1557,29 @@ checksum = "a60bcaff7397072dca0017d1db428e30d5002e00b6847703e2e42005c95fbe00"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.31", "syn 2.0.37",
]
[[package]]
name = "strum"
version = "0.24.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "063e6045c0e62079840579a7e47a355ae92f60eb74daaf156fb1e84ba164e63f"
dependencies = [
"strum_macros",
]
[[package]]
name = "strum_macros"
version = "0.24.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1e385be0d24f186b4ce2f9982191e7101bb737312ad61c1f2f984f34bcf85d59"
dependencies = [
"heck",
"proc-macro2",
"quote",
"rustversion",
"syn 1.0.109",
] ]
[[package]] [[package]]
@ -1614,9 +1595,9 @@ dependencies = [
[[package]] [[package]]
name = "syn" name = "syn"
version = "2.0.31" version = "2.0.37"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "718fa2415bcb8d8bd775917a1bf12a7931b6dfa890753378538118181e0cb398" checksum = "7303ef2c05cd654186cb250d29049a24840ca25d2747c25c0381c8d9e2f582e8"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
@ -1655,7 +1636,7 @@ checksum = "49922ecae66cc8a249b77e68d1d0623c1b2c514f0060c27cdc68bd62a1219d35"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.31", "syn 2.0.37",
] ]
[[package]] [[package]]
@ -1728,7 +1709,7 @@ checksum = "630bdcf245f78637c13ec01ffae6187cca34625e8c63150d424b59e55af2675e"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.31", "syn 2.0.37",
] ]
[[package]] [[package]]
@ -1822,7 +1803,7 @@ checksum = "84fd902d4e0b9a4b27f2f440108dc034e1758628a9b702f8ec61ad66355422fa"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.31", "syn 2.0.37",
] ]
[[package]] [[package]]
@ -1851,7 +1832,7 @@ checksum = "5f4f31f56159e98206da9efd823404b79b6ef3143b4a7ab76e67b1751b25a4ab"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.31", "syn 2.0.37",
] ]
[[package]] [[package]]
@ -1891,15 +1872,15 @@ dependencies = [
"Inflector", "Inflector",
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.31", "syn 2.0.37",
"termcolor", "termcolor",
] ]
[[package]] [[package]]
name = "tungstenite" name = "tungstenite"
version = "0.20.0" version = "0.20.1"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e862a1c4128df0112ab625f55cd5c934bcb4312ba80b39ae4b4835a3fd58e649" checksum = "9e3dac10fd62eaf6617d3a904ae222845979aec67c615d1c842b4002c7666fb9"
dependencies = [ dependencies = [
"byteorder", "byteorder",
"bytes", "bytes",
@ -1921,15 +1902,6 @@ version = "1.16.0"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "497961ef93d974e23eb6f433eb5fe1b7930b659f06d12dec6fc44a8f554c0bba" checksum = "497961ef93d974e23eb6f433eb5fe1b7930b659f06d12dec6fc44a8f554c0bba"
[[package]]
name = "unicase"
version = "2.7.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f7d2d4dafb69621809a81864c9c1b864479e1235c0dd4e199924b9742439ed89"
dependencies = [
"version_check",
]
[[package]] [[package]]
name = "unicode-bidi" name = "unicode-bidi"
version = "0.3.13" version = "0.3.13"
@ -1957,12 +1929,6 @@ version = "1.10.1"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1dd624098567895118886609431a7c3b8f516e41d30e0643f03d94592a147e36" checksum = "1dd624098567895118886609431a7c3b8f516e41d30e0643f03d94592a147e36"
[[package]]
name = "unicode-width"
version = "0.1.10"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c0edd1e5b14653f783770bce4a4dabb4a5108a5370a5f5d8cfe8710c361f6c8b"
[[package]] [[package]]
name = "untrusted" name = "untrusted"
version = "0.7.1" version = "0.7.1"
@ -1987,12 +1953,6 @@ version = "0.7.6"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "09cc8ee72d2a9becf2f2febe0205bbed8fc6615b7cb429ad062dc7b7ddd036a9" checksum = "09cc8ee72d2a9becf2f2febe0205bbed8fc6615b7cb429ad062dc7b7ddd036a9"
[[package]]
name = "utf8parse"
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "711b9620af191e0cdc7468a8d14e709c3dcdb115b36f838e601583af800a370a"
[[package]] [[package]]
name = "uuid" name = "uuid"
version = "1.4.1" version = "1.4.1"
@ -2046,7 +2006,7 @@ dependencies = [
"once_cell", "once_cell",
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.31", "syn 2.0.37",
"wasm-bindgen-shared", "wasm-bindgen-shared",
] ]
@ -2080,7 +2040,7 @@ checksum = "54681b18a46765f095758388f2d0cf16eb8d4169b639ab575a8f5693af210c7b"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.31", "syn 2.0.37",
"wasm-bindgen-backend", "wasm-bindgen-backend",
"wasm-bindgen-shared", "wasm-bindgen-shared",
] ]
@ -2198,6 +2158,15 @@ version = "0.48.5"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ed94fce61571a4006852b7389a063ab983c02eb1bb37b47f8272ce92d06d9538" checksum = "ed94fce61571a4006852b7389a063ab983c02eb1bb37b47f8272ce92d06d9538"
[[package]]
name = "winnow"
version = "0.5.15"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7c2e3184b9c4e92ad5167ca73039d0c42476302ab603e2fec4487511f38ccefc"
dependencies = [
"memchr",
]
[[package]] [[package]]
name = "winreg" name = "winreg"
version = "0.50.0" version = "0.50.0"


@ -71,7 +71,7 @@ pub async fn modify_ast_for_sketch(
// Let's get the path info. // Let's get the path info.
let resp = engine let resp = engine
.send_modeling_cmd_get_response( .send_modeling_cmd(
uuid::Uuid::new_v4(), uuid::Uuid::new_v4(),
SourceRange::default(), SourceRange::default(),
ModelingCmd::PathGetInfo { path_id: sketch_id }, ModelingCmd::PathGetInfo { path_id: sketch_id },
@ -88,47 +88,6 @@ pub async fn modify_ast_for_sketch(
})); }));
}; };
/* // Let's try to get the children of the sketch.
let resp = engine
.send_modeling_cmd_get_response(
uuid::Uuid::new_v4(),
SourceRange::default(),
ModelingCmd::EntityGetAllChildUuids { entity_id: sketch_id },
)
.await?;
let kittycad::types::OkWebSocketResponseData::Modeling {
modeling_response: kittycad::types::OkModelingCmdResponse::EntityGetAllChildUuids { data: children_info },
} = &resp
else {
return Err(KclError::Engine(KclErrorDetails {
message: format!("Get child info response was not as expected: {:?}", resp),
source_ranges: vec![SourceRange::default()],
}));
};
println!("children_info: {:#?}", children_info);
// Let's try to get the parent id.
let resp = engine
.send_modeling_cmd_get_response(
uuid::Uuid::new_v4(),
SourceRange::default(),
ModelingCmd::EntityGetParentId { entity_id: sketch_id },
)
.await?;
let kittycad::types::OkWebSocketResponseData::Modeling {
modeling_response: kittycad::types::OkModelingCmdResponse::EntityGetParentId { data: parent_info },
} = &resp
else {
return Err(KclError::Engine(KclErrorDetails {
message: format!("Get parent id response was not as expected: {:?}", resp),
source_ranges: vec![SourceRange::default()],
}));
};
println!("parent_info: {:#?}", parent_info);*/
// Now let's get the control points for all the segments. // Now let's get the control points for all the segments.
// TODO: We should probably await all these at once so we aren't going one by one. // TODO: We should probably await all these at once so we aren't going one by one.
// But I guess this is fine for now. // But I guess this is fine for now.
@ -136,7 +95,7 @@ pub async fn modify_ast_for_sketch(
let mut control_points = Vec::new(); let mut control_points = Vec::new();
for segment in &path_info.segments { for segment in &path_info.segments {
if let Some(command_id) = &segment.command_id { if let Some(command_id) = &segment.command_id {
let h = engine.send_modeling_cmd_get_response( let h = engine.send_modeling_cmd(
uuid::Uuid::new_v4(), uuid::Uuid::new_v4(),
SourceRange::default(), SourceRange::default(),
ModelingCmd::CurveGetControlPoints { curve_id: *command_id }, ModelingCmd::CurveGetControlPoints { curve_id: *command_id },
@ -207,7 +166,7 @@ pub async fn modify_ast_for_sketch(
let recasted = program.recast(&FormatOptions::default(), 0); let recasted = program.recast(&FormatOptions::default(), 0);
// Re-parse the ast so we get the correct source ranges. // Re-parse the ast so we get the correct source ranges.
let tokens = crate::tokeniser::lexer(&recasted); let tokens = crate::token::lexer(&recasted);
let parser = crate::parser::Parser::new(tokens); let parser = crate::parser::Parser::new(tokens);
*program = parser.ast()?; *program = parser.ast()?;


@ -571,11 +571,12 @@ impl BinaryPart {
} }
} }
-    pub fn get_result(
+    #[async_recursion::async_recursion(?Send)]
+    pub async fn get_result(
         &self,
         memory: &mut ProgramMemory,
         pipe_info: &mut PipeInfo,
-        engine: &mut EngineConnection,
+        engine: &EngineConnection,
     ) -> Result<MemoryItem, KclError> {
         // We DO NOT set this gloablly because if we did and this was called inside a pipe it would
         // stop the execution of the pipe.
@ -590,11 +591,13 @@ impl BinaryPart {
                 Ok(value.clone())
             }
             BinaryPart::BinaryExpression(binary_expression) => {
-                binary_expression.get_result(memory, &mut new_pipe_info, engine)
+                binary_expression.get_result(memory, &mut new_pipe_info, engine).await
+            }
+            BinaryPart::CallExpression(call_expression) => {
+                call_expression.execute(memory, &mut new_pipe_info, engine).await
             }
-            BinaryPart::CallExpression(call_expression) => call_expression.execute(memory, &mut new_pipe_info, engine),
             BinaryPart::UnaryExpression(unary_expression) => {
-                unary_expression.get_result(memory, &mut new_pipe_info, engine)
+                unary_expression.get_result(memory, &mut new_pipe_info, engine).await
             }
             BinaryPart::MemberExpression(member_expression) => member_expression.get_result(memory),
         }
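Note on the hunk above: turning `get_result` into an `async fn` that (indirectly) calls itself requires boxing the recursive future, which is what the `async_recursion` attribute does. A minimal, self-contained sketch of that pattern follows; the names `Expr` and `eval` are hypothetical and not part of this PR.

// Sketch only: a recursive `async fn` would have an infinitely-sized future,
// so #[async_recursion] boxes each recursive call. `?Send` mirrors the
// non-Send executor used in this crate.
use async_recursion::async_recursion;

enum Expr {
    Num(f64),
    Neg(Box<Expr>),
    Add(Box<Expr>, Box<Expr>),
}

#[async_recursion(?Send)]
async fn eval(e: &Expr) -> f64 {
    match e {
        Expr::Num(n) => *n,
        Expr::Neg(inner) => -eval(inner).await,
        Expr::Add(l, r) => eval(l).await + eval(r).await,
    }
}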
@ -810,11 +813,12 @@ impl CallExpression {
) )
} }
-    pub fn execute(
+    #[async_recursion::async_recursion(?Send)]
+    pub async fn execute(
         &self,
         memory: &mut ProgramMemory,
         pipe_info: &mut PipeInfo,
-        engine: &mut EngineConnection,
+        engine: &EngineConnection,
     ) -> Result<MemoryItem, KclError> {
         let fn_name = self.callee.name.clone();
@ -828,7 +832,7 @@ impl CallExpression {
                     value.clone()
                 }
                 Value::BinaryExpression(binary_expression) => {
-                    binary_expression.get_result(memory, pipe_info, engine)?
+                    binary_expression.get_result(memory, pipe_info, engine).await?
                 }
                 Value::CallExpression(call_expression) => {
                     // We DO NOT set this gloablly because if we did and this was called inside a pipe it would
@ -836,11 +840,15 @@ impl CallExpression {
                     // THIS IS IMPORTANT.
                     let mut new_pipe_info = pipe_info.clone();
                     new_pipe_info.is_in_pipe = false;
-                    call_expression.execute(memory, &mut new_pipe_info, engine)?
+                    call_expression.execute(memory, &mut new_pipe_info, engine).await?
                 }
-                Value::UnaryExpression(unary_expression) => unary_expression.get_result(memory, pipe_info, engine)?,
-                Value::ObjectExpression(object_expression) => object_expression.execute(memory, pipe_info, engine)?,
-                Value::ArrayExpression(array_expression) => array_expression.execute(memory, pipe_info, engine)?,
+                Value::UnaryExpression(unary_expression) => {
+                    unary_expression.get_result(memory, pipe_info, engine).await?
+                }
+                Value::ObjectExpression(object_expression) => {
+                    object_expression.execute(memory, pipe_info, engine).await?
+                }
+                Value::ArrayExpression(array_expression) => array_expression.execute(memory, pipe_info, engine).await?,
Value::PipeExpression(pipe_expression) => { Value::PipeExpression(pipe_expression) => {
return Err(KclError::Semantic(KclErrorDetails { return Err(KclError::Semantic(KclErrorDetails {
message: format!("PipeExpression not implemented here: {:?}", pipe_expression), message: format!("PipeExpression not implemented here: {:?}", pipe_expression),
@ -871,31 +879,28 @@ impl CallExpression {
         match &self.function {
             Function::StdLib { func } => {
-                /*
-                println!(
-                    "Calling stdlib function: {}, source_range: {:?}, args: {:?}",
-                    fn_name, source_range, fn_args
-                );*/
+                let source_range: SourceRange = self.into();
                 // Attempt to call the function.
-                let mut args = crate::std::Args::new(fn_args, self.into(), engine);
-                let result = func.std_lib_fn()(&mut args)?;
+                let args = crate::std::Args::new(fn_args, self.into(), engine.clone());
+                let result = func.std_lib_fn()(args).await?;
                 if pipe_info.is_in_pipe {
                     pipe_info.index += 1;
                     pipe_info.previous_results.push(result);
-                    execute_pipe_body(memory, &pipe_info.body.clone(), pipe_info, self.into(), engine)
+                    execute_pipe_body(memory, &pipe_info.body.clone(), pipe_info, self.into(), engine).await
                 } else {
                     Ok(result)
                 }
             }
             Function::InMemory => {
-                let mem = memory.clone();
-                let func = mem.get(&fn_name, self.into())?;
-                let result = func.call_fn(&fn_args, &mem, engine)?.ok_or_else(|| {
-                    KclError::UndefinedValue(KclErrorDetails {
-                        message: format!("Result of function {} is undefined", fn_name),
-                        source_ranges: vec![self.into()],
-                    })
-                })?;
+                let func = memory.get(&fn_name, self.into())?;
+                let result = func
+                    .call_fn(fn_args, memory.clone(), engine.clone())
+                    .await?
+                    .ok_or_else(|| {
+                        KclError::UndefinedValue(KclErrorDetails {
+                            message: format!("Result of function {} is undefined", fn_name),
+                            source_ranges: vec![self.into()],
+                        })
+                    })?;
                 let result = result.get_value()?;
@ -903,7 +908,7 @@ impl CallExpression {
pipe_info.index += 1; pipe_info.index += 1;
pipe_info.previous_results.push(result); pipe_info.previous_results.push(result);
execute_pipe_body(memory, &pipe_info.body.clone(), pipe_info, self.into(), engine) execute_pipe_body(memory, &pipe_info.body.clone(), pipe_info, self.into(), engine).await
} else { } else {
Ok(result) Ok(result)
} }
@ -1424,11 +1429,12 @@ impl ArrayExpression {
None None
} }
pub fn execute( #[async_recursion::async_recursion(?Send)]
pub async fn execute(
&self, &self,
memory: &mut ProgramMemory, memory: &mut ProgramMemory,
pipe_info: &mut PipeInfo, pipe_info: &mut PipeInfo,
engine: &mut EngineConnection, engine: &EngineConnection,
) -> Result<MemoryItem, KclError> { ) -> Result<MemoryItem, KclError> {
let mut results = Vec::with_capacity(self.elements.len()); let mut results = Vec::with_capacity(self.elements.len());
@ -1440,7 +1446,7 @@ impl ArrayExpression {
value.clone() value.clone()
} }
Value::BinaryExpression(binary_expression) => { Value::BinaryExpression(binary_expression) => {
binary_expression.get_result(memory, pipe_info, engine)? binary_expression.get_result(memory, pipe_info, engine).await?
} }
Value::CallExpression(call_expression) => { Value::CallExpression(call_expression) => {
// We DO NOT set this gloablly because if we did and this was called inside a pipe it would // We DO NOT set this gloablly because if we did and this was called inside a pipe it would
@ -1448,12 +1454,16 @@ impl ArrayExpression {
// THIS IS IMPORTANT. // THIS IS IMPORTANT.
let mut new_pipe_info = pipe_info.clone(); let mut new_pipe_info = pipe_info.clone();
new_pipe_info.is_in_pipe = false; new_pipe_info.is_in_pipe = false;
call_expression.execute(memory, &mut new_pipe_info, engine)? call_expression.execute(memory, &mut new_pipe_info, engine).await?
} }
-            Value::UnaryExpression(unary_expression) => unary_expression.get_result(memory, pipe_info, engine)?,
-            Value::ObjectExpression(object_expression) => object_expression.execute(memory, pipe_info, engine)?,
-            Value::ArrayExpression(array_expression) => array_expression.execute(memory, pipe_info, engine)?,
-            Value::PipeExpression(pipe_expression) => pipe_expression.get_result(memory, pipe_info, engine)?,
+            Value::UnaryExpression(unary_expression) => {
+                unary_expression.get_result(memory, pipe_info, engine).await?
+            }
+            Value::ObjectExpression(object_expression) => {
+                object_expression.execute(memory, pipe_info, engine).await?
+            }
+            Value::ArrayExpression(array_expression) => array_expression.execute(memory, pipe_info, engine).await?,
+            Value::PipeExpression(pipe_expression) => pipe_expression.get_result(memory, pipe_info, engine).await?,
Value::PipeSubstitution(pipe_substitution) => { Value::PipeSubstitution(pipe_substitution) => {
return Err(KclError::Semantic(KclErrorDetails { return Err(KclError::Semantic(KclErrorDetails {
message: format!("PipeSubstitution not implemented here: {:?}", pipe_substitution), message: format!("PipeSubstitution not implemented here: {:?}", pipe_substitution),
@ -1569,11 +1579,12 @@ impl ObjectExpression {
None None
} }
pub fn execute( #[async_recursion::async_recursion(?Send)]
pub async fn execute(
&self, &self,
memory: &mut ProgramMemory, memory: &mut ProgramMemory,
pipe_info: &mut PipeInfo, pipe_info: &mut PipeInfo,
engine: &mut EngineConnection, engine: &EngineConnection,
) -> Result<MemoryItem, KclError> { ) -> Result<MemoryItem, KclError> {
let mut object = Map::new(); let mut object = Map::new();
for property in &self.properties { for property in &self.properties {
@ -1584,7 +1595,7 @@ impl ObjectExpression {
value.clone() value.clone()
} }
Value::BinaryExpression(binary_expression) => { Value::BinaryExpression(binary_expression) => {
binary_expression.get_result(memory, pipe_info, engine)? binary_expression.get_result(memory, pipe_info, engine).await?
} }
Value::CallExpression(call_expression) => { Value::CallExpression(call_expression) => {
// We DO NOT set this gloablly because if we did and this was called inside a pipe it would // We DO NOT set this gloablly because if we did and this was called inside a pipe it would
@ -1592,12 +1603,16 @@ impl ObjectExpression {
// THIS IS IMPORTANT. // THIS IS IMPORTANT.
let mut new_pipe_info = pipe_info.clone(); let mut new_pipe_info = pipe_info.clone();
new_pipe_info.is_in_pipe = false; new_pipe_info.is_in_pipe = false;
call_expression.execute(memory, &mut new_pipe_info, engine)? call_expression.execute(memory, &mut new_pipe_info, engine).await?
} }
-            Value::UnaryExpression(unary_expression) => unary_expression.get_result(memory, pipe_info, engine)?,
-            Value::ObjectExpression(object_expression) => object_expression.execute(memory, pipe_info, engine)?,
-            Value::ArrayExpression(array_expression) => array_expression.execute(memory, pipe_info, engine)?,
-            Value::PipeExpression(pipe_expression) => pipe_expression.get_result(memory, pipe_info, engine)?,
+            Value::UnaryExpression(unary_expression) => {
+                unary_expression.get_result(memory, pipe_info, engine).await?
+            }
+            Value::ObjectExpression(object_expression) => {
+                object_expression.execute(memory, pipe_info, engine).await?
+            }
+            Value::ArrayExpression(array_expression) => array_expression.execute(memory, pipe_info, engine).await?,
+            Value::PipeExpression(pipe_expression) => pipe_expression.get_result(memory, pipe_info, engine).await?,
Value::PipeSubstitution(pipe_substitution) => { Value::PipeSubstitution(pipe_substitution) => {
return Err(KclError::Semantic(KclErrorDetails { return Err(KclError::Semantic(KclErrorDetails {
message: format!("PipeSubstitution not implemented here: {:?}", pipe_substitution), message: format!("PipeSubstitution not implemented here: {:?}", pipe_substitution),
@ -2005,11 +2020,12 @@ impl BinaryExpression {
None None
} }
pub fn get_result( #[async_recursion::async_recursion(?Send)]
pub async fn get_result(
&self, &self,
memory: &mut ProgramMemory, memory: &mut ProgramMemory,
pipe_info: &mut PipeInfo, pipe_info: &mut PipeInfo,
engine: &mut EngineConnection, engine: &EngineConnection,
) -> Result<MemoryItem, KclError> { ) -> Result<MemoryItem, KclError> {
// We DO NOT set this gloablly because if we did and this was called inside a pipe it would // We DO NOT set this gloablly because if we did and this was called inside a pipe it would
// stop the execution of the pipe. // stop the execution of the pipe.
@ -2019,11 +2035,13 @@ impl BinaryExpression {
let left_json_value = self let left_json_value = self
.left .left
.get_result(memory, &mut new_pipe_info, engine)? .get_result(memory, &mut new_pipe_info, engine)
.await?
.get_json_value()?; .get_json_value()?;
let right_json_value = self let right_json_value = self
.right .right
.get_result(memory, &mut new_pipe_info, engine)? .get_result(memory, &mut new_pipe_info, engine)
.await?
.get_json_value()?; .get_json_value()?;
// First check if we are doing string concatenation. // First check if we are doing string concatenation.
@ -2173,11 +2191,11 @@ impl UnaryExpression {
format!("{}{}", &self.operator, self.argument.recast(options, 0)) format!("{}{}", &self.operator, self.argument.recast(options, 0))
} }
pub fn get_result( pub async fn get_result(
&self, &self,
memory: &mut ProgramMemory, memory: &mut ProgramMemory,
pipe_info: &mut PipeInfo, pipe_info: &mut PipeInfo,
engine: &mut EngineConnection, engine: &EngineConnection,
) -> Result<MemoryItem, KclError> { ) -> Result<MemoryItem, KclError> {
// We DO NOT set this gloablly because if we did and this was called inside a pipe it would // We DO NOT set this gloablly because if we did and this was called inside a pipe it would
// stop the execution of the pipe. // stop the execution of the pipe.
@ -2188,7 +2206,8 @@ impl UnaryExpression {
let num = parse_json_number_as_f64( let num = parse_json_number_as_f64(
&self &self
.argument .argument
.get_result(memory, &mut new_pipe_info, engine)? .get_result(memory, &mut new_pipe_info, engine)
.await?
.get_json_value()?, .get_json_value()?,
self.into(), self.into(),
)?; )?;
@ -2310,16 +2329,16 @@ impl PipeExpression {
None None
} }
pub fn get_result( pub async fn get_result(
&self, &self,
memory: &mut ProgramMemory, memory: &mut ProgramMemory,
pipe_info: &mut PipeInfo, pipe_info: &mut PipeInfo,
engine: &mut EngineConnection, engine: &EngineConnection,
) -> Result<MemoryItem, KclError> { ) -> Result<MemoryItem, KclError> {
// Reset the previous results. // Reset the previous results.
pipe_info.previous_results = vec![]; pipe_info.previous_results = vec![];
pipe_info.index = 0; pipe_info.index = 0;
execute_pipe_body(memory, &self.body, pipe_info, self.into(), engine) execute_pipe_body(memory, &self.body, pipe_info, self.into(), engine).await
} }
/// Rename all identifiers that have the old name to the new given name. /// Rename all identifiers that have the old name to the new given name.
@ -2330,12 +2349,13 @@ impl PipeExpression {
} }
} }
fn execute_pipe_body( #[async_recursion::async_recursion(?Send)]
async fn execute_pipe_body(
memory: &mut ProgramMemory, memory: &mut ProgramMemory,
body: &[Value], body: &[Value],
pipe_info: &mut PipeInfo, pipe_info: &mut PipeInfo,
source_range: SourceRange, source_range: SourceRange,
engine: &mut EngineConnection, engine: &EngineConnection,
) -> Result<MemoryItem, KclError> { ) -> Result<MemoryItem, KclError> {
if pipe_info.index == body.len() { if pipe_info.index == body.len() {
pipe_info.is_in_pipe = false; pipe_info.is_in_pipe = false;
@ -2360,15 +2380,15 @@ fn execute_pipe_body(
match expression { match expression {
Value::BinaryExpression(binary_expression) => { Value::BinaryExpression(binary_expression) => {
let result = binary_expression.get_result(memory, pipe_info, engine)?; let result = binary_expression.get_result(memory, pipe_info, engine).await?;
pipe_info.previous_results.push(result); pipe_info.previous_results.push(result);
pipe_info.index += 1; pipe_info.index += 1;
execute_pipe_body(memory, body, pipe_info, source_range, engine) execute_pipe_body(memory, body, pipe_info, source_range, engine).await
} }
Value::CallExpression(call_expression) => { Value::CallExpression(call_expression) => {
pipe_info.is_in_pipe = true; pipe_info.is_in_pipe = true;
pipe_info.body = body.to_vec(); pipe_info.body = body.to_vec();
call_expression.execute(memory, pipe_info, engine) call_expression.execute(memory, pipe_info, engine).await
} }
_ => { _ => {
// Return an error this should not happen. // Return an error this should not happen.
@ -2671,7 +2691,7 @@ fn ghi = (x) => {
} }
show(part001)"#; show(part001)"#;
let tokens = crate::tokeniser::lexer(code); let tokens = crate::token::lexer(code);
let parser = crate::parser::Parser::new(tokens); let parser = crate::parser::Parser::new(tokens);
let program = parser.ast().unwrap(); let program = parser.ast().unwrap();
let symbols = program.get_lsp_symbols(code); let symbols = program.get_lsp_symbols(code);
@ -2699,7 +2719,7 @@ show(part001)
let some_program_string = r#"const part001 = startSketchAt([0.0, 5.0]) let some_program_string = r#"const part001 = startSketchAt([0.0, 5.0])
|> line([0.4900857016, -0.0240763666], %) |> line([0.4900857016, -0.0240763666], %)
|> line([0.6804562304, 0.9087880491], %)"#; |> line([0.6804562304, 0.9087880491], %)"#;
let tokens = crate::tokeniser::lexer(some_program_string); let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens); let parser = crate::parser::Parser::new(tokens);
let program = parser.ast().unwrap(); let program = parser.ast().unwrap();
@ -2718,7 +2738,7 @@ show(part001)
let some_program_string = r#"const part001 = startSketchAt([0.0, 5.0]) let some_program_string = r#"const part001 = startSketchAt([0.0, 5.0])
|> line([0.4900857016, -0.0240763666], %) // hello world |> line([0.4900857016, -0.0240763666], %) // hello world
|> line([0.6804562304, 0.9087880491], %)"#; |> line([0.6804562304, 0.9087880491], %)"#;
let tokens = crate::tokeniser::lexer(some_program_string); let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens); let parser = crate::parser::Parser::new(tokens);
let program = parser.ast().unwrap(); let program = parser.ast().unwrap();
@ -2737,7 +2757,7 @@ show(part001)
|> line([0.4900857016, -0.0240763666], %) |> line([0.4900857016, -0.0240763666], %)
// hello world // hello world
|> line([0.6804562304, 0.9087880491], %)"#; |> line([0.6804562304, 0.9087880491], %)"#;
let tokens = crate::tokeniser::lexer(some_program_string); let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens); let parser = crate::parser::Parser::new(tokens);
let program = parser.ast().unwrap(); let program = parser.ast().unwrap();
@ -2763,7 +2783,7 @@ show(part001)
// this is also a comment // this is also a comment
return things return things
}"#; }"#;
let tokens = crate::tokeniser::lexer(some_program_string); let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens); let parser = crate::parser::Parser::new(tokens);
let program = parser.ast().unwrap(); let program = parser.ast().unwrap();
@ -2800,7 +2820,7 @@ const mySk1 = startSketchAt([0, 0])
|> ry(45, %) |> ry(45, %)
|> rx(45, %) |> rx(45, %)
// one more for good measure"#; // one more for good measure"#;
let tokens = crate::tokeniser::lexer(some_program_string); let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens); let parser = crate::parser::Parser::new(tokens);
let program = parser.ast().unwrap(); let program = parser.ast().unwrap();
@ -2839,7 +2859,7 @@ a comment between pipe expression statements */
|> line([-0.42, -1.72], %) |> line([-0.42, -1.72], %)
show(part001)"#; show(part001)"#;
let tokens = crate::tokeniser::lexer(some_program_string); let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens); let parser = crate::parser::Parser::new(tokens);
let program = parser.ast().unwrap(); let program = parser.ast().unwrap();
@ -2865,7 +2885,7 @@ const yo = [
" hey oooooo really long long long" " hey oooooo really long long long"
] ]
"#; "#;
let tokens = crate::tokeniser::lexer(some_program_string); let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens); let parser = crate::parser::Parser::new(tokens);
let program = parser.ast().unwrap(); let program = parser.ast().unwrap();
@ -2883,7 +2903,7 @@ const key = 'c'
const things = "things" const things = "things"
// this is also a comment"#; // this is also a comment"#;
let tokens = crate::tokeniser::lexer(some_program_string); let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens); let parser = crate::parser::Parser::new(tokens);
let program = parser.ast().unwrap(); let program = parser.ast().unwrap();
@ -2901,7 +2921,7 @@ const things = "things"
// a comment // a comment
" "
}"#; }"#;
let tokens = crate::tokeniser::lexer(some_program_string); let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens); let parser = crate::parser::Parser::new(tokens);
let program = parser.ast().unwrap(); let program = parser.ast().unwrap();
@ -2926,7 +2946,7 @@ const part001 = startSketchAt([0, 0])
-angleToMatchLengthY('seg01', myVar, %), -angleToMatchLengthY('seg01', myVar, %),
myVar myVar
], %) // ln-lineTo-yAbsolute should use angleToMatchLengthY helper"#; ], %) // ln-lineTo-yAbsolute should use angleToMatchLengthY helper"#;
let tokens = crate::tokeniser::lexer(some_program_string); let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens); let parser = crate::parser::Parser::new(tokens);
let program = parser.ast().unwrap(); let program = parser.ast().unwrap();
@ -2952,7 +2972,7 @@ const part001 = startSketchAt([0, 0])
myVar myVar
], %) // ln-lineTo-yAbsolute should use angleToMatchLengthY helper ], %) // ln-lineTo-yAbsolute should use angleToMatchLengthY helper
"#; "#;
let tokens = crate::tokeniser::lexer(some_program_string); let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens); let parser = crate::parser::Parser::new(tokens);
let program = parser.ast().unwrap(); let program = parser.ast().unwrap();
@ -2983,7 +3003,7 @@ fn ghi = (part001) => {
} }
show(part001)"#; show(part001)"#;
let tokens = crate::tokeniser::lexer(some_program_string); let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens); let parser = crate::parser::Parser::new(tokens);
let mut program = parser.ast().unwrap(); let mut program = parser.ast().unwrap();
program.rename_symbol("mySuperCoolPart", 6); program.rename_symbol("mySuperCoolPart", 6);
@ -3014,7 +3034,7 @@ show(mySuperCoolPart)
let some_program_string = r#"fn ghi = (x, y, z) => { let some_program_string = r#"fn ghi = (x, y, z) => {
return x return x
}"#; }"#;
let tokens = crate::tokeniser::lexer(some_program_string); let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens); let parser = crate::parser::Parser::new(tokens);
let mut program = parser.ast().unwrap(); let mut program = parser.ast().unwrap();
program.rename_symbol("newName", 10); program.rename_symbol("newName", 10);
@ -3043,7 +3063,7 @@ const firstExtrude = startSketchAt([0,0])
|> extrude(h, %) |> extrude(h, %)
show(firstExtrude)"#; show(firstExtrude)"#;
let tokens = crate::tokeniser::lexer(some_program_string); let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens); let parser = crate::parser::Parser::new(tokens);
let program = parser.ast().unwrap(); let program = parser.ast().unwrap();
@ -3069,7 +3089,7 @@ show(firstExtrude)
#[tokio::test(flavor = "multi_thread")] #[tokio::test(flavor = "multi_thread")]
async fn test_recast_math_start_negative() { async fn test_recast_math_start_negative() {
let some_program_string = r#"const myVar = -5 + 6"#; let some_program_string = r#"const myVar = -5 + 6"#;
let tokens = crate::tokeniser::lexer(some_program_string); let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens); let parser = crate::parser::Parser::new(tokens);
let program = parser.ast().unwrap(); let program = parser.ast().unwrap();
@ -3085,7 +3105,7 @@ const FOS = 2
const sigmaAllow = 8 const sigmaAllow = 8
const width = 20 const width = 20
const thickness = sqrt(distance * p * FOS * 6 / (sigmaAllow * width))"#; const thickness = sqrt(distance * p * FOS * 6 / (sigmaAllow * width))"#;
let tokens = crate::tokeniser::lexer(some_program_string); let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens); let parser = crate::parser::Parser::new(tokens);
let program = parser.ast().unwrap(); let program = parser.ast().unwrap();


@ -3,10 +3,11 @@
use std::sync::Arc; use std::sync::Arc;
use anyhow::Result; use anyhow::{anyhow, Result};
use dashmap::DashMap; use dashmap::DashMap;
use futures::{SinkExt, StreamExt}; use futures::{SinkExt, StreamExt};
use kittycad::types::{OkWebSocketResponseData, WebSocketRequest, WebSocketResponse}; use kittycad::types::{OkWebSocketResponseData, WebSocketRequest, WebSocketResponse};
use tokio::sync::{mpsc, oneshot};
use tokio_tungstenite::tungstenite::Message as WsMsg; use tokio_tungstenite::tungstenite::Message as WsMsg;
use crate::{ use crate::{
@ -14,18 +15,13 @@ use crate::{
errors::{KclError, KclErrorDetails}, errors::{KclError, KclErrorDetails},
}; };
-#[derive(Debug)]
+type WebSocketTcpWrite = futures::stream::SplitSink<tokio_tungstenite::WebSocketStream<reqwest::Upgraded>, WsMsg>;
+
+#[derive(Debug, Clone)]
+#[allow(dead_code)] // for the TcpReadHandle
 pub struct EngineConnection {
-    tcp_write: futures::stream::SplitSink<tokio_tungstenite::WebSocketStream<reqwest::Upgraded>, WsMsg>,
-    tcp_read_handle: tokio::task::JoinHandle<Result<()>>,
+    engine_req_tx: mpsc::Sender<ToEngineReq>,
     responses: Arc<DashMap<uuid::Uuid, WebSocketResponse>>,
-}
-
-impl Drop for EngineConnection {
-    fn drop(&mut self) {
-        // Drop the read handle.
-        self.tcp_read_handle.abort();
-    }
+    tcp_read_handle: Arc<TcpReadHandle>,
 }
pub struct TcpRead { pub struct TcpRead {
@ -46,16 +42,63 @@ impl TcpRead {
} }
} }
#[derive(Debug)]
pub struct TcpReadHandle {
handle: Arc<tokio::task::JoinHandle<Result<()>>>,
}
impl Drop for TcpReadHandle {
fn drop(&mut self) {
// Drop the read handle.
self.handle.abort();
}
}
/// Requests to send to the engine, and a way to await a response.
struct ToEngineReq {
/// The request to send
req: WebSocketRequest,
/// If this resolves to Ok, the request was sent.
/// If this resolves to Err, the request could not be sent.
/// If this has not yet resolved, the request has not been sent yet.
request_sent: oneshot::Sender<Result<()>>,
}
impl EngineConnection { impl EngineConnection {
/// Start waiting for incoming engine requests, and send each one over the WebSocket to the engine.
async fn start_write_actor(mut tcp_write: WebSocketTcpWrite, mut engine_req_rx: mpsc::Receiver<ToEngineReq>) {
while let Some(req) = engine_req_rx.recv().await {
let ToEngineReq { req, request_sent } = req;
let res = Self::inner_send_to_engine(req, &mut tcp_write).await;
let _ = request_sent.send(res);
}
}
/// Send the given `request` to the engine via the WebSocket connection `tcp_write`.
async fn inner_send_to_engine(request: WebSocketRequest, tcp_write: &mut WebSocketTcpWrite) -> Result<()> {
let msg = serde_json::to_string(&request).map_err(|e| anyhow!("could not serialize json: {e}"))?;
tcp_write
.send(WsMsg::Text(msg))
.await
.map_err(|e| anyhow!("could not send json over websocket: {e}"))?;
Ok(())
}
pub async fn new(ws: reqwest::Upgraded) -> Result<EngineConnection> { pub async fn new(ws: reqwest::Upgraded) -> Result<EngineConnection> {
let ws_stream = tokio_tungstenite::WebSocketStream::from_raw_socket( let ws_stream = tokio_tungstenite::WebSocketStream::from_raw_socket(
ws, ws,
tokio_tungstenite::tungstenite::protocol::Role::Client, tokio_tungstenite::tungstenite::protocol::Role::Client,
None, Some(tokio_tungstenite::tungstenite::protocol::WebSocketConfig {
write_buffer_size: 1024 * 128,
max_write_buffer_size: 1024 * 256,
..Default::default()
}),
) )
.await; .await;
let (tcp_write, tcp_read) = ws_stream.split(); let (tcp_write, tcp_read) = ws_stream.split();
let (engine_req_tx, engine_req_rx) = mpsc::channel(10);
tokio::task::spawn(Self::start_write_actor(tcp_write, engine_req_rx));
let mut tcp_read = TcpRead { stream: tcp_read }; let mut tcp_read = TcpRead { stream: tcp_read };
@ -80,42 +123,34 @@ impl EngineConnection {
}); });
         Ok(EngineConnection {
-            tcp_write,
-            tcp_read_handle,
+            engine_req_tx,
+            tcp_read_handle: Arc::new(TcpReadHandle {
+                handle: Arc::new(tcp_read_handle),
+            }),
             responses,
         })
     }
-
-    pub async fn tcp_send(&mut self, msg: WebSocketRequest) -> Result<()> {
-        let msg = serde_json::to_string(&msg)?;
-        self.tcp_write.send(WsMsg::Text(msg)).await?;
-        Ok(())
-    }
 }
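The `ToEngineReq` doc comments above describe the shape of this change: callers queue a request plus a one-shot acknowledgement sender on an mpsc channel, and a single writer task owns the socket and answers through the one-shot. A minimal sketch of that mpsc + oneshot actor shape, with hypothetical names (`Req`, `payload`) rather than the real types:

// Sketch only: one task owns the resource; callers send a request carrying a
// oneshot::Sender and await the reply on the matching Receiver.
use tokio::sync::{mpsc, oneshot};

struct Req {
    payload: String,
    reply: oneshot::Sender<usize>,
}

#[tokio::main]
async fn main() {
    let (tx, mut rx) = mpsc::channel::<Req>(10);

    // Actor task: serializes access to the "connection".
    tokio::spawn(async move {
        while let Some(Req { payload, reply }) = rx.recv().await {
            let _ = reply.send(payload.len());
        }
    });

    let (reply_tx, reply_rx) = oneshot::channel();
    tx.send(Req { payload: "hello".into(), reply: reply_tx }).await.unwrap();
    assert_eq!(reply_rx.await.unwrap(), 5);
}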
#[async_trait::async_trait(?Send)] #[async_trait::async_trait(?Send)]
impl EngineManager for EngineConnection { impl EngineManager for EngineConnection {
-    /// Send a modeling command.
-    /// Do not wait for the response message.
-    fn send_modeling_cmd(
-        &mut self,
-        id: uuid::Uuid,
-        source_range: crate::executor::SourceRange,
-        cmd: kittycad::types::ModelingCmd,
-    ) -> Result<(), KclError> {
-        futures::executor::block_on(self.send_modeling_cmd_get_response(id, source_range, cmd))?;
-        Ok(())
-    }
-
-    /// Send a modeling command and wait for the response message.
-    async fn send_modeling_cmd_get_response(
-        &mut self,
+    async fn send_modeling_cmd(
+        &self,
         id: uuid::Uuid,
         source_range: crate::executor::SourceRange,
         cmd: kittycad::types::ModelingCmd,
     ) -> Result<OkWebSocketResponseData, KclError> {
-        self.tcp_send(WebSocketRequest::ModelingCmdReq { cmd, cmd_id: id })
+        let (tx, rx) = oneshot::channel();
+        // Send the request to the engine, via the actor.
+        self.engine_req_tx
+            .send(ToEngineReq {
+                req: WebSocketRequest::ModelingCmdReq {
+                    cmd: cmd.clone(),
+                    cmd_id: id,
+                },
+                request_sent: tx,
+            })
             .await
             .map_err(|e| {
                 KclError::Engine(KclErrorDetails {
@ -124,18 +159,40 @@ impl EngineManager for EngineConnection {
                 })
             })?;
// Wait for the request to be sent.
rx.await
.map_err(|e| {
KclError::Engine(KclErrorDetails {
message: format!("could not send request to the engine actor: {e}"),
source_ranges: vec![source_range],
})
})?
.map_err(|e| {
KclError::Engine(KclErrorDetails {
message: format!("could not send request to the engine: {e}"),
source_ranges: vec![source_range],
})
})?;
         // Wait for the response.
-        loop {
-            if let Some(resp) = self.responses.get(&id) {
-                if let Some(data) = &resp.resp {
-                    return Ok(data.clone());
+        let current_time = std::time::Instant::now();
+        while current_time.elapsed().as_secs() < 60 {
+            // We pop off the responses to cleanup our mappings.
+            if let Some((_, resp)) = self.responses.remove(&id) {
+                return if let Some(data) = &resp.resp {
+                    Ok(data.clone())
                 } else {
-                    return Err(KclError::Engine(KclErrorDetails {
+                    Err(KclError::Engine(KclErrorDetails {
                         message: format!("Modeling command failed: {:?}", resp.errors),
                         source_ranges: vec![source_range],
-                    }));
-                }
+                    }))
+                };
             }
         }
+        Err(KclError::Engine(KclErrorDetails {
+            message: format!("Modeling command timed out `{}`: {:?}", id, cmd),
+            source_ranges: vec![source_range],
+        }))
     }
 }
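The loop above polls the shared responses map until the matching command id appears or 60 seconds elapse, removing the entry as it reads it. A minimal sketch of that wait-or-time-out shape, written synchronously with hypothetical types and a small sleep added so the sketch does not spin:

// Sketch only: poll a shared map for a keyed response until a deadline passes.
use std::{collections::HashMap, sync::Mutex, time::{Duration, Instant}};

fn wait_for(responses: &Mutex<HashMap<u64, String>>, id: u64) -> Option<String> {
    let start = Instant::now();
    while start.elapsed() < Duration::from_secs(60) {
        // Remove the entry so the map does not grow unboundedly.
        if let Some(resp) = responses.lock().unwrap().remove(&id) {
            return Some(resp);
        }
        std::thread::sleep(Duration::from_millis(10));
    }
    None
}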


@ -6,7 +6,7 @@ use kittycad::types::OkWebSocketResponseData;
use crate::errors::KclError; use crate::errors::KclError;
#[derive(Debug)] #[derive(Debug, Clone)]
pub struct EngineConnection {} pub struct EngineConnection {}
impl EngineConnection { impl EngineConnection {
@ -17,21 +17,14 @@ impl EngineConnection {
#[async_trait::async_trait(?Send)] #[async_trait::async_trait(?Send)]
impl crate::engine::EngineManager for EngineConnection { impl crate::engine::EngineManager for EngineConnection {
-    fn send_modeling_cmd(
-        &mut self,
-        _id: uuid::Uuid,
-        _source_range: crate::executor::SourceRange,
-        _cmd: kittycad::types::ModelingCmd,
-    ) -> Result<(), KclError> {
-        Ok(())
-    }
-
-    async fn send_modeling_cmd_get_response(
-        &mut self,
+    async fn send_modeling_cmd(
+        &self,
         _id: uuid::Uuid,
         _source_range: crate::executor::SourceRange,
         _cmd: kittycad::types::ModelingCmd,
     ) -> Result<OkWebSocketResponseData, KclError> {
-        todo!()
+        Ok(OkWebSocketResponseData::Modeling {
+            modeling_response: kittycad::types::OkModelingCmdResponse::Empty {},
+        })
     }
} }


@ -1,5 +1,6 @@
//! Functions for setting up our WebSocket and WebRTC connections for communications with the //! Functions for setting up our WebSocket and WebRTC connections for communications with the
//! engine. //! engine.
use std::sync::Arc;
use anyhow::Result; use anyhow::Result;
use kittycad::types::WebSocketRequest; use kittycad::types::WebSocketRequest;
@ -23,44 +24,21 @@ extern "C" {
#[derive(Debug, Clone)] #[derive(Debug, Clone)]
pub struct EngineConnection { pub struct EngineConnection {
manager: EngineCommandManager, manager: Arc<EngineCommandManager>,
} }
impl EngineConnection { impl EngineConnection {
pub async fn new(manager: EngineCommandManager) -> Result<EngineConnection, JsValue> { pub async fn new(manager: EngineCommandManager) -> Result<EngineConnection, JsValue> {
Ok(EngineConnection { manager }) Ok(EngineConnection {
manager: Arc::new(manager),
})
} }
} }
#[async_trait::async_trait(?Send)] #[async_trait::async_trait(?Send)]
 impl crate::engine::EngineManager for EngineConnection {
-    fn send_modeling_cmd(
-        &mut self,
-        id: uuid::Uuid,
-        source_range: crate::executor::SourceRange,
-        cmd: kittycad::types::ModelingCmd,
-    ) -> Result<(), KclError> {
-        let source_range_str = serde_json::to_string(&source_range).map_err(|e| {
-            KclError::Engine(KclErrorDetails {
-                message: format!("Failed to serialize source range: {:?}", e),
-                source_ranges: vec![source_range],
-            })
-        })?;
-        let ws_msg = WebSocketRequest::ModelingCmdReq { cmd, cmd_id: id };
-        let cmd_str = serde_json::to_string(&ws_msg).map_err(|e| {
-            KclError::Engine(KclErrorDetails {
-                message: format!("Failed to serialize modeling command: {:?}", e),
-                source_ranges: vec![source_range],
-            })
-        })?;
-        let _ = self
-            .manager
-            .sendModelingCommandFromWasm(id.to_string(), source_range_str, cmd_str);
-        Ok(())
-    }
-
-    async fn send_modeling_cmd_get_response(
-        &mut self,
+    async fn send_modeling_cmd(
+        &self,
         id: uuid::Uuid,
         source_range: crate::executor::SourceRange,
         cmd: kittycad::types::ModelingCmd,


@ -32,19 +32,10 @@ use anyhow::Result;
pub use conn_mock::EngineConnection; pub use conn_mock::EngineConnection;
#[async_trait::async_trait(?Send)] #[async_trait::async_trait(?Send)]
-pub trait EngineManager {
-    /// Send a modeling command.
-    /// Do not wait for the response message.
-    fn send_modeling_cmd(
-        &mut self,
-        id: uuid::Uuid,
-        source_range: crate::executor::SourceRange,
-        cmd: kittycad::types::ModelingCmd,
-    ) -> Result<(), crate::errors::KclError>;
-
+pub trait EngineManager: Clone {
     /// Send a modeling command and wait for the response message.
-    async fn send_modeling_cmd_get_response(
-        &mut self,
+    async fn send_modeling_cmd(
+        &self,
         id: uuid::Uuid,
         source_range: crate::executor::SourceRange,
         cmd: kittycad::types::ModelingCmd,


@ -104,7 +104,7 @@ pub enum MemoryItem {
SketchGroup(Box<SketchGroup>), SketchGroup(Box<SketchGroup>),
ExtrudeGroup(Box<ExtrudeGroup>), ExtrudeGroup(Box<ExtrudeGroup>),
#[ts(skip)] #[ts(skip)]
ExtrudeTransform(ExtrudeTransform), ExtrudeTransform(Box<ExtrudeTransform>),
#[ts(skip)] #[ts(skip)]
Function { Function {
#[serde(skip)] #[serde(skip)]
@ -134,13 +134,28 @@ pub struct ExtrudeTransform {
pub meta: Vec<Metadata>, pub meta: Vec<Metadata>,
} }
-pub type MemoryFunction = fn(
-    s: &[MemoryItem],
-    memory: &ProgramMemory,
-    expression: &FunctionExpression,
-    metadata: &[Metadata],
-    engine: &mut EngineConnection,
-) -> Result<Option<ProgramReturn>, KclError>;
+pub type MemoryFunction =
+    fn(
+        s: Vec<MemoryItem>,
+        memory: ProgramMemory,
+        expression: Box<FunctionExpression>,
+        metadata: Vec<Metadata>,
+        engine: EngineConnection,
+    ) -> std::pin::Pin<Box<dyn std::future::Future<Output = Result<Option<ProgramReturn>, KclError>>>>;
+
+fn force_memory_function<
+    F: Fn(
+        Vec<MemoryItem>,
+        ProgramMemory,
+        Box<FunctionExpression>,
+        Vec<Metadata>,
+        EngineConnection,
+    ) -> std::pin::Pin<Box<dyn std::future::Future<Output = Result<Option<ProgramReturn>, KclError>>>>,
+>(
+    f: F,
+) -> F {
+    f
+}
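`force_memory_function` above is an identity helper: its `Fn` bound pins down the closure's return type so the `Box::pin(async move { .. })` inside it coerces to the boxed `dyn Future` that `MemoryFunction` now expects. A stripped-down sketch of the same trick, with hypothetical names (`BoxedFut`, `force_fn`):

// Sketch only: the generic bound tells inference the closure must return the
// boxed trait-object future, so the Box::pin(...) in tail position coerces to
// BoxedFut instead of a concrete future type.
use std::{future::Future, pin::Pin};

type BoxedFut = Pin<Box<dyn Future<Output = i32>>>;

fn force_fn<F: Fn(i32) -> BoxedFut>(f: F) -> F {
    f
}

fn main() {
    let f = force_fn(|x: i32| Box::pin(async move { x + 1 }));
    let answer = futures::executor::block_on(f(41));
    assert_eq!(answer, 42);
}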
impl From<MemoryItem> for Vec<SourceRange> { impl From<MemoryItem> for Vec<SourceRange> {
fn from(item: MemoryItem) -> Self { fn from(item: MemoryItem) -> Self {
@ -168,24 +183,24 @@ impl MemoryItem {
} }
} }
-    pub fn call_fn(
+    pub async fn call_fn(
         &self,
-        args: &[MemoryItem],
-        memory: &ProgramMemory,
-        engine: &mut EngineConnection,
+        args: Vec<MemoryItem>,
+        memory: ProgramMemory,
+        engine: EngineConnection,
     ) -> Result<Option<ProgramReturn>, KclError> {
-        if let MemoryItem::Function { func, expression, meta } = self {
+        if let MemoryItem::Function { func, expression, meta } = &self {
             if let Some(func) = func {
-                func(args, memory, expression, meta, engine)
+                func(args, memory, expression.clone(), meta.clone(), engine).await
             } else {
                 Err(KclError::Semantic(KclErrorDetails {
-                    message: format!("Not a function: {:?}", self),
+                    message: format!("Not a function: {:?}", expression),
                     source_ranges: vec![],
                 }))
             }
         } else {
             Err(KclError::Semantic(KclErrorDetails {
-                message: format!("not a function: {:?}", self),
+                message: "not a in memory function".to_string(),
                 source_ranges: vec![],
             }))
         }
@ -579,11 +594,11 @@ impl Default for PipeInfo {
} }
/// Execute a AST's program. /// Execute a AST's program.
pub fn execute( pub async fn execute(
program: crate::ast::types::Program, program: crate::ast::types::Program,
memory: &mut ProgramMemory, memory: &mut ProgramMemory,
options: BodyType, options: BodyType,
engine: &mut EngineConnection, engine: &EngineConnection,
) -> Result<ProgramMemory, KclError> { ) -> Result<ProgramMemory, KclError> {
let mut pipe_info = PipeInfo::default(); let mut pipe_info = PipeInfo::default();
@ -602,7 +617,23 @@ pub fn execute(
args.push(memory_item.clone()); args.push(memory_item.clone());
} }
Value::CallExpression(call_expr) => { Value::CallExpression(call_expr) => {
let result = call_expr.execute(memory, &mut pipe_info, engine)?; let result = call_expr.execute(memory, &mut pipe_info, engine).await?;
args.push(result);
}
Value::BinaryExpression(binary_expression) => {
let result = binary_expression.get_result(memory, &mut pipe_info, engine).await?;
args.push(result);
}
Value::UnaryExpression(unary_expression) => {
let result = unary_expression.get_result(memory, &mut pipe_info, engine).await?;
args.push(result);
}
Value::ObjectExpression(object_expression) => {
let result = object_expression.execute(memory, &mut pipe_info, engine).await?;
args.push(result);
}
Value::ArrayExpression(array_expression) => {
let result = array_expression.execute(memory, &mut pipe_info, engine).await?;
args.push(result); args.push(result);
} }
// We do nothing for the rest. // We do nothing for the rest.
@ -620,7 +651,7 @@ pub fn execute(
memory.return_ = Some(ProgramReturn::Arguments(call_expr.arguments.clone())); memory.return_ = Some(ProgramReturn::Arguments(call_expr.arguments.clone()));
} else if let Some(func) = memory.clone().root.get(&fn_name) { } else if let Some(func) = memory.clone().root.get(&fn_name) {
let result = func.call_fn(&args, memory, engine)?; let result = func.call_fn(args.clone(), memory.clone(), engine.clone()).await?;
memory.return_ = result; memory.return_ = result;
} else { } else {
@ -646,22 +677,27 @@ pub fn execute(
memory.add(&var_name, value.clone(), source_range)?; memory.add(&var_name, value.clone(), source_range)?;
} }
Value::BinaryExpression(binary_expression) => { Value::BinaryExpression(binary_expression) => {
let result = binary_expression.get_result(memory, &mut pipe_info, engine)?; let result = binary_expression.get_result(memory, &mut pipe_info, engine).await?;
memory.add(&var_name, result, source_range)?; memory.add(&var_name, result, source_range)?;
} }
Value::FunctionExpression(function_expression) => { Value::FunctionExpression(function_expression) => {
-                    memory.add(
-                        &var_name,
-                        MemoryItem::Function{
-                            expression: function_expression.clone(),
-                            meta: vec![metadata],
-                            func: Some(|args: &[MemoryItem], memory: &ProgramMemory, function_expression: &FunctionExpression, _metadata: &[Metadata], engine: &mut EngineConnection| -> Result<Option<ProgramReturn>, KclError> {
+                    let mem_func = force_memory_function(
+                        |args: Vec<MemoryItem>,
+                         memory: ProgramMemory,
+                         function_expression: Box<FunctionExpression>,
+                         _metadata: Vec<Metadata>,
+                         engine: EngineConnection| {
+                            Box::pin(async move {
                                 let mut fn_memory = memory.clone();
                                 if args.len() != function_expression.params.len() {
                                     return Err(KclError::Semantic(KclErrorDetails {
-                                        message: format!("Expected {} arguments, got {}", function_expression.params.len(), args.len()),
-                                        source_ranges: vec![function_expression.into()],
+                                        message: format!(
+                                            "Expected {} arguments, got {}",
+                                            function_expression.params.len(),
+                                            args.len(),
+                                        ),
+                                        source_ranges: vec![(&function_expression).into()],
                                     }));
                                 }
@ -674,20 +710,34 @@ pub fn execute(
                                     )?;
                                 }
-                                let result = execute(function_expression.body.clone(), &mut fn_memory, BodyType::Block, engine)?;
+                                let result = execute(
+                                    function_expression.body.clone(),
+                                    &mut fn_memory,
+                                    BodyType::Block,
+                                    &engine,
+                                )
+                                .await?;
                                 Ok(result.return_)
                             })
                         },
+                    );
+                    memory.add(
+                        &var_name,
+                        MemoryItem::Function {
+                            expression: function_expression.clone(),
+                            meta: vec![metadata],
+                            func: Some(mem_func),
+                        },
                         source_range,
                     )?;
                 }
Value::CallExpression(call_expression) => { Value::CallExpression(call_expression) => {
let result = call_expression.execute(memory, &mut pipe_info, engine)?; let result = call_expression.execute(memory, &mut pipe_info, engine).await?;
memory.add(&var_name, result, source_range)?; memory.add(&var_name, result, source_range)?;
} }
Value::PipeExpression(pipe_expression) => { Value::PipeExpression(pipe_expression) => {
let result = pipe_expression.get_result(memory, &mut pipe_info, engine)?; let result = pipe_expression.get_result(memory, &mut pipe_info, engine).await?;
memory.add(&var_name, result, source_range)?; memory.add(&var_name, result, source_range)?;
} }
Value::PipeSubstitution(pipe_substitution) => { Value::PipeSubstitution(pipe_substitution) => {
@ -700,11 +750,11 @@ pub fn execute(
})); }));
} }
Value::ArrayExpression(array_expression) => { Value::ArrayExpression(array_expression) => {
let result = array_expression.execute(memory, &mut pipe_info, engine)?; let result = array_expression.execute(memory, &mut pipe_info, engine).await?;
memory.add(&var_name, result, source_range)?; memory.add(&var_name, result, source_range)?;
} }
Value::ObjectExpression(object_expression) => { Value::ObjectExpression(object_expression) => {
let result = object_expression.execute(memory, &mut pipe_info, engine)?; let result = object_expression.execute(memory, &mut pipe_info, engine).await?;
memory.add(&var_name, result, source_range)?; memory.add(&var_name, result, source_range)?;
} }
Value::MemberExpression(member_expression) => { Value::MemberExpression(member_expression) => {
@ -712,7 +762,7 @@ pub fn execute(
memory.add(&var_name, result, source_range)?; memory.add(&var_name, result, source_range)?;
} }
Value::UnaryExpression(unary_expression) => { Value::UnaryExpression(unary_expression) => {
let result = unary_expression.get_result(memory, &mut pipe_info, engine)?; let result = unary_expression.get_result(memory, &mut pipe_info, engine).await?;
memory.add(&var_name, result, source_range)?; memory.add(&var_name, result, source_range)?;
} }
} }
@ -720,11 +770,11 @@ pub fn execute(
} }
BodyItem::ReturnStatement(return_statement) => match &return_statement.argument { BodyItem::ReturnStatement(return_statement) => match &return_statement.argument {
Value::BinaryExpression(bin_expr) => { Value::BinaryExpression(bin_expr) => {
let result = bin_expr.get_result(memory, &mut pipe_info, engine)?; let result = bin_expr.get_result(memory, &mut pipe_info, engine).await?;
memory.return_ = Some(ProgramReturn::Value(result)); memory.return_ = Some(ProgramReturn::Value(result));
} }
Value::UnaryExpression(unary_expr) => { Value::UnaryExpression(unary_expr) => {
let result = unary_expr.get_result(memory, &mut pipe_info, engine)?; let result = unary_expr.get_result(memory, &mut pipe_info, engine).await?;
memory.return_ = Some(ProgramReturn::Value(result)); memory.return_ = Some(ProgramReturn::Value(result));
} }
Value::Identifier(identifier) => { Value::Identifier(identifier) => {
@ -735,15 +785,15 @@ pub fn execute(
memory.return_ = Some(ProgramReturn::Value(literal.into())); memory.return_ = Some(ProgramReturn::Value(literal.into()));
} }
Value::ArrayExpression(array_expr) => { Value::ArrayExpression(array_expr) => {
let result = array_expr.execute(memory, &mut pipe_info, engine)?; let result = array_expr.execute(memory, &mut pipe_info, engine).await?;
memory.return_ = Some(ProgramReturn::Value(result)); memory.return_ = Some(ProgramReturn::Value(result));
} }
Value::ObjectExpression(obj_expr) => { Value::ObjectExpression(obj_expr) => {
let result = obj_expr.execute(memory, &mut pipe_info, engine)?; let result = obj_expr.execute(memory, &mut pipe_info, engine).await?;
memory.return_ = Some(ProgramReturn::Value(result)); memory.return_ = Some(ProgramReturn::Value(result));
} }
Value::CallExpression(call_expr) => { Value::CallExpression(call_expr) => {
let result = call_expr.execute(memory, &mut pipe_info, engine)?; let result = call_expr.execute(memory, &mut pipe_info, engine).await?;
memory.return_ = Some(ProgramReturn::Value(result)); memory.return_ = Some(ProgramReturn::Value(result));
} }
Value::MemberExpression(member_expr) => { Value::MemberExpression(member_expr) => {
@ -751,7 +801,7 @@ pub fn execute(
memory.return_ = Some(ProgramReturn::Value(result)); memory.return_ = Some(ProgramReturn::Value(result));
} }
Value::PipeExpression(pipe_expr) => { Value::PipeExpression(pipe_expr) => {
let result = pipe_expr.get_result(memory, &mut pipe_info, engine)?; let result = pipe_expr.get_result(memory, &mut pipe_info, engine).await?;
memory.return_ = Some(ProgramReturn::Value(result)); memory.return_ = Some(ProgramReturn::Value(result));
} }
Value::PipeSubstitution(_) => {} Value::PipeSubstitution(_) => {}
@ -770,12 +820,12 @@ mod tests {
use super::*; use super::*;
pub async fn parse_execute(code: &str) -> Result<ProgramMemory> { pub async fn parse_execute(code: &str) -> Result<ProgramMemory> {
let tokens = crate::tokeniser::lexer(code); let tokens = crate::token::lexer(code);
let parser = crate::parser::Parser::new(tokens); let parser = crate::parser::Parser::new(tokens);
let program = parser.ast()?; let program = parser.ast()?;
let mut mem: ProgramMemory = Default::default(); let mut mem: ProgramMemory = Default::default();
let mut engine = EngineConnection::new().await?; let engine = EngineConnection::new().await?;
let memory = execute(program, &mut mem, BodyType::Root, &mut engine)?; let memory = execute(program, &mut mem, BodyType::Root, &engine).await?;
Ok(memory) Ok(memory)
} }
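Since execute and EngineConnection::new are now async end to end, the test helper above has to be awaited as well. A minimal sketch of how a test might drive it, assuming a tokio test runtime in dev-dependencies (the attribute and the anyhow return type are assumptions; parse_execute is the helper from the hunk above):

    // Sketch only: assumes a tokio test runtime is available to the crate.
    #[tokio::test]
    async fn executes_simple_program() -> anyhow::Result<()> {
        // parse_execute is the async helper defined above.
        let _memory = parse_execute("const myVar = 5 + 6").await?;
        Ok(())
    }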

View File

@ -9,4 +9,4 @@ pub mod math_parser;
pub mod parser; pub mod parser;
pub mod server; pub mod server;
pub mod std; pub mod std;
pub mod tokeniser; pub mod token;
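The tokeniser module is renamed to token here, so every call site swaps crate::tokeniser for crate::token while the lexer entry point keeps its name. An illustrative before/after at a typical call site:

    // before: let tokens = crate::tokeniser::lexer("const x = 1");
    // after, as used throughout the updated tests:
    let tokens = crate::token::lexer("const x = 1");
    let parser = crate::parser::Parser::new(tokens);
    let ast = parser.ast()?;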

View File

@ -10,8 +10,8 @@ use crate::{
}, },
errors::{KclError, KclErrorDetails}, errors::{KclError, KclErrorDetails},
executor::SourceRange, executor::SourceRange,
parser::{is_not_code_token, Parser}, parser::Parser,
tokeniser::{Token, TokenType}, token::{Token, TokenType},
}; };
#[derive(Debug, PartialEq, Eq, Deserialize, Serialize, Clone, ts_rs::TS)] #[derive(Debug, PartialEq, Eq, Deserialize, Serialize, Clone, ts_rs::TS)]
@ -334,7 +334,7 @@ impl ReversePolishNotation {
return rpn.parse(); return rpn.parse();
} }
if is_not_code_token(current_token) { if !current_token.is_code_token() {
let rpn = ReversePolishNotation::new(&self.parser.tokens[1..], &self.previous_postfix, &self.operators); let rpn = ReversePolishNotation::new(&self.parser.tokens[1..], &self.previous_postfix, &self.operators);
return rpn.parse(); return rpn.parse();
} }
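With is_not_code_token removed from the parser (see the parser.rs hunk further down), the check now lives on Token itself. A rough sketch of what the method plausibly looks like, inferred from the old free function and the new tests; the real body lives in the new token module, which this diff does not show:

    impl Token {
        /// Whitespace and comments are not code; every other token type is.
        pub fn is_code_token(&self) -> bool {
            !matches!(
                self.token_type,
                TokenType::Whitespace | TokenType::LineComment | TokenType::BlockComment
            )
        }
    }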
@ -704,7 +704,7 @@ mod test {
#[test] #[test]
fn test_parse_expression() { fn test_parse_expression() {
let tokens = crate::tokeniser::lexer("1 + 2"); let tokens = crate::token::lexer("1 + 2");
let mut parser = MathParser::new(&tokens); let mut parser = MathParser::new(&tokens);
let result = parser.parse().unwrap(); let result = parser.parse().unwrap();
assert_eq!( assert_eq!(
@ -731,7 +731,7 @@ mod test {
#[test] #[test]
fn test_parse_expression_add_no_spaces() { fn test_parse_expression_add_no_spaces() {
let tokens = crate::tokeniser::lexer("1+2"); let tokens = crate::token::lexer("1+2");
let mut parser = MathParser::new(&tokens); let mut parser = MathParser::new(&tokens);
let result = parser.parse().unwrap(); let result = parser.parse().unwrap();
assert_eq!( assert_eq!(
@ -758,7 +758,7 @@ mod test {
#[test] #[test]
fn test_parse_expression_sub_no_spaces() { fn test_parse_expression_sub_no_spaces() {
let tokens = crate::tokeniser::lexer("1 -2"); let tokens = crate::token::lexer("1 -2");
let mut parser = MathParser::new(&tokens); let mut parser = MathParser::new(&tokens);
let result = parser.parse().unwrap(); let result = parser.parse().unwrap();
assert_eq!( assert_eq!(
@ -785,7 +785,7 @@ mod test {
#[test] #[test]
fn test_parse_expression_plus_followed_by_star() { fn test_parse_expression_plus_followed_by_star() {
let tokens = crate::tokeniser::lexer("1 + 2 * 3"); let tokens = crate::token::lexer("1 + 2 * 3");
let mut parser = MathParser::new(&tokens); let mut parser = MathParser::new(&tokens);
let result = parser.parse().unwrap(); let result = parser.parse().unwrap();
assert_eq!( assert_eq!(
@ -823,7 +823,7 @@ mod test {
#[test] #[test]
fn test_parse_expression_with_parentheses() { fn test_parse_expression_with_parentheses() {
let tokens = crate::tokeniser::lexer("1 * ( 2 + 3 )"); let tokens = crate::token::lexer("1 * ( 2 + 3 )");
let mut parser = MathParser::new(&tokens); let mut parser = MathParser::new(&tokens);
let result = parser.parse().unwrap(); let result = parser.parse().unwrap();
assert_eq!( assert_eq!(
@ -861,7 +861,7 @@ mod test {
#[test] #[test]
fn test_parse_expression_parens_in_middle() { fn test_parse_expression_parens_in_middle() {
let tokens = crate::tokeniser::lexer("1 * ( 2 + 3 ) / 4"); let tokens = crate::token::lexer("1 * ( 2 + 3 ) / 4");
let mut parser = MathParser::new(&tokens); let mut parser = MathParser::new(&tokens);
let result = parser.parse().unwrap(); let result = parser.parse().unwrap();
assert_eq!( assert_eq!(
@ -910,7 +910,7 @@ mod test {
#[test] #[test]
fn test_parse_expression_parans_and_predence() { fn test_parse_expression_parans_and_predence() {
let tokens = crate::tokeniser::lexer("1 + ( 2 + 3 ) / 4"); let tokens = crate::token::lexer("1 + ( 2 + 3 ) / 4");
let mut parser = MathParser::new(&tokens); let mut parser = MathParser::new(&tokens);
let result = parser.parse().unwrap(); let result = parser.parse().unwrap();
assert_eq!( assert_eq!(
@ -958,7 +958,7 @@ mod test {
} }
#[test] #[test]
fn test_parse_expression_nested() { fn test_parse_expression_nested() {
let tokens = crate::tokeniser::lexer("1 * (( 2 + 3 ) / 4 + 5 )"); let tokens = crate::token::lexer("1 * (( 2 + 3 ) / 4 + 5 )");
let mut parser = MathParser::new(&tokens); let mut parser = MathParser::new(&tokens);
let result = parser.parse().unwrap(); let result = parser.parse().unwrap();
assert_eq!( assert_eq!(
@ -1017,7 +1017,7 @@ mod test {
} }
#[test] #[test]
fn test_parse_expression_redundant_braces() { fn test_parse_expression_redundant_braces() {
let tokens = crate::tokeniser::lexer("1 * ((( 2 + 3 )))"); let tokens = crate::token::lexer("1 * ((( 2 + 3 )))");
let mut parser = MathParser::new(&tokens); let mut parser = MathParser::new(&tokens);
let result = parser.parse().unwrap(); let result = parser.parse().unwrap();
assert_eq!( assert_eq!(
@ -1055,7 +1055,7 @@ mod test {
#[test] #[test]
fn test_reverse_polish_notation_simple() { fn test_reverse_polish_notation_simple() {
let parser = ReversePolishNotation::new(&crate::tokeniser::lexer("1 + 2"), &[], &[]); let parser = ReversePolishNotation::new(&crate::token::lexer("1 + 2"), &[], &[]);
let result = parser.parse().unwrap(); let result = parser.parse().unwrap();
assert_eq!( assert_eq!(
result, result,
@ -1084,7 +1084,7 @@ mod test {
#[test] #[test]
fn test_reverse_polish_notation_complex() { fn test_reverse_polish_notation_complex() {
let parser = ReversePolishNotation::new(&crate::tokeniser::lexer("1 + 2 * 3"), &[], &[]); let parser = ReversePolishNotation::new(&crate::token::lexer("1 + 2 * 3"), &[], &[]);
let result = parser.parse().unwrap(); let result = parser.parse().unwrap();
assert_eq!( assert_eq!(
result, result,
@ -1125,7 +1125,7 @@ mod test {
#[test] #[test]
fn test_reverse_polish_notation_complex_with_parentheses() { fn test_reverse_polish_notation_complex_with_parentheses() {
let parser = ReversePolishNotation::new(&crate::tokeniser::lexer("1 * ( 2 + 3 )"), &[], &[]); let parser = ReversePolishNotation::new(&crate::token::lexer("1 * ( 2 + 3 )"), &[], &[]);
let result = parser.parse().unwrap(); let result = parser.parse().unwrap();
assert_eq!( assert_eq!(
result, result,
@ -1179,7 +1179,7 @@ mod test {
#[test] #[test]
fn test_parse_expression_redundant_braces_around_literal() { fn test_parse_expression_redundant_braces_around_literal() {
let code = "2 + (((3)))"; let code = "2 + (((3)))";
let tokens = crate::tokeniser::lexer(code); let tokens = crate::token::lexer(code);
let mut parser = MathParser::new(&tokens); let mut parser = MathParser::new(&tokens);
let result = parser.parse().unwrap(); let result = parser.parse().unwrap();
assert_eq!( assert_eq!(
@ -1274,7 +1274,7 @@ mod test {
#[test] #[test]
fn test_parse_expression_braces_around_lots_of_math() { fn test_parse_expression_braces_around_lots_of_math() {
let code = "(distance * p * FOS * 6 / (sigmaAllow * width))"; let code = "(distance * p * FOS * 6 / (sigmaAllow * width))";
let tokens = crate::tokeniser::lexer(code); let tokens = crate::token::lexer(code);
let mut parser = MathParser::new(&tokens); let mut parser = MathParser::new(&tokens);
let result = parser.parse(); let result = parser.parse();
assert!(result.is_ok()); assert!(result.is_ok());
@ -1283,7 +1283,7 @@ mod test {
#[test] #[test]
fn test_parse_expression_braces_around_internals_lots_of_math() { fn test_parse_expression_braces_around_internals_lots_of_math() {
let code = "distance * p * FOS * 6 / (sigmaAllow * width)"; let code = "distance * p * FOS * 6 / (sigmaAllow * width)";
let tokens = crate::tokeniser::lexer(code); let tokens = crate::token::lexer(code);
let mut parser = MathParser::new(&tokens); let mut parser = MathParser::new(&tokens);
let result = parser.parse(); let result = parser.parse();
assert!(result.is_ok()); assert!(result.is_ok());
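All of the math parser tests above follow the same tokenize-then-parse shape, and the method form of the check makes it easy to drop non-code tokens before handing a slice to other passes. A small sketch, assuming the lexer emits whitespace and comment tokens alongside code tokens:

    let tokens = crate::token::lexer("1 + 2 // trailing comment");
    // Keep only tokens that carry code; whitespace and comments are filtered out.
    let code_only: Vec<_> = tokens.into_iter().filter(|t| t.is_code_token()).collect();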

View File

@ -10,7 +10,7 @@ use crate::{
}, },
errors::{KclError, KclErrorDetails}, errors::{KclError, KclErrorDetails},
math_parser::MathParser, math_parser::MathParser,
tokeniser::{Token, TokenType}, token::{Token, TokenType},
}; };
pub const PIPE_SUBSTITUTION_OPERATOR: &str = "%"; pub const PIPE_SUBSTITUTION_OPERATOR: &str = "%";
@ -249,7 +249,7 @@ impl Parser {
} }
let current_token = self.get_token(index)?; let current_token = self.get_token(index)?;
if is_not_code_token(current_token) { if !current_token.is_code_token() {
return self.find_end_of_non_code_node(index + 1); return self.find_end_of_non_code_node(index + 1);
} }
@ -262,7 +262,7 @@ impl Parser {
} }
let current_token = self.get_token(index)?; let current_token = self.get_token(index)?;
if is_not_code_token(current_token) { if !current_token.is_code_token() {
return self.find_start_of_non_code_node(index - 1); return self.find_start_of_non_code_node(index - 1);
} }
@ -365,7 +365,7 @@ impl Parser {
}); });
}; };
if is_not_code_token(token) { if !token.is_code_token() {
let non_code_node = self.make_non_code_node(new_index)?; let non_code_node = self.make_non_code_node(new_index)?;
let new_new_index = non_code_node.1 + 1; let new_new_index = non_code_node.1 + 1;
let bonus_non_code_node = non_code_node.0; let bonus_non_code_node = non_code_node.0;
@ -1623,7 +1623,7 @@ impl Parser {
}); });
} }
if is_not_code_token(token) { if !token.is_code_token() {
let next_token = self.next_meaningful_token(token_index, Some(0))?; let next_token = self.next_meaningful_token(token_index, Some(0))?;
if let Some(node) = &next_token.non_code_node { if let Some(node) = &next_token.non_code_node {
if previous_body.is_empty() { if previous_body.is_empty() {
@ -1788,12 +1788,6 @@ impl Parser {
} }
} }
pub fn is_not_code_token(token: &Token) -> bool {
token.token_type == TokenType::Whitespace
|| token.token_type == TokenType::LineComment
|| token.token_type == TokenType::BlockComment
}
#[cfg(test)] #[cfg(test)]
mod tests { mod tests {
use pretty_assertions::assert_eq; use pretty_assertions::assert_eq;
@ -1803,7 +1797,7 @@ mod tests {
#[test] #[test]
fn test_make_identifier() { fn test_make_identifier() {
let tokens = crate::tokeniser::lexer("a"); let tokens = crate::token::lexer("a");
let parser = Parser::new(tokens); let parser = Parser::new(tokens);
let identifier = parser.make_identifier(0).unwrap(); let identifier = parser.make_identifier(0).unwrap();
assert_eq!( assert_eq!(
@ -1818,7 +1812,7 @@ mod tests {
#[test] #[test]
fn test_make_identifier_with_const_myvar_equals_5_and_index_2() { fn test_make_identifier_with_const_myvar_equals_5_and_index_2() {
let tokens = crate::tokeniser::lexer("const myVar = 5"); let tokens = crate::token::lexer("const myVar = 5");
let parser = Parser::new(tokens); let parser = Parser::new(tokens);
let identifier = parser.make_identifier(2).unwrap(); let identifier = parser.make_identifier(2).unwrap();
assert_eq!( assert_eq!(
@ -1833,7 +1827,7 @@ mod tests {
#[test] #[test]
fn test_make_identifier_multiline() { fn test_make_identifier_multiline() {
let tokens = crate::tokeniser::lexer("const myVar = 5\nconst newVar = myVar + 1"); let tokens = crate::token::lexer("const myVar = 5\nconst newVar = myVar + 1");
let parser = Parser::new(tokens); let parser = Parser::new(tokens);
let identifier = parser.make_identifier(2).unwrap(); let identifier = parser.make_identifier(2).unwrap();
assert_eq!( assert_eq!(
@ -1857,7 +1851,7 @@ mod tests {
#[test] #[test]
fn test_make_identifier_call_expression() { fn test_make_identifier_call_expression() {
let tokens = crate::tokeniser::lexer("log(5, \"hello\", aIdentifier)"); let tokens = crate::token::lexer("log(5, \"hello\", aIdentifier)");
let parser = Parser::new(tokens); let parser = Parser::new(tokens);
let identifier = parser.make_identifier(0).unwrap(); let identifier = parser.make_identifier(0).unwrap();
assert_eq!( assert_eq!(
@ -1880,7 +1874,7 @@ mod tests {
} }
#[test] #[test]
fn test_make_non_code_node() { fn test_make_non_code_node() {
let tokens = crate::tokeniser::lexer("log(5, \"hello\", aIdentifier)"); let tokens = crate::token::lexer("log(5, \"hello\", aIdentifier)");
let parser = Parser::new(tokens); let parser = Parser::new(tokens);
let index = 4; let index = 4;
let expected_output = (None, 4); let expected_output = (None, 4);
@ -1889,7 +1883,7 @@ mod tests {
let index = 7; let index = 7;
let expected_output = (None, 7); let expected_output = (None, 7);
assert_eq!(parser.make_non_code_node(index).unwrap(), expected_output); assert_eq!(parser.make_non_code_node(index).unwrap(), expected_output);
let tokens = crate::tokeniser::lexer( let tokens = crate::token::lexer(
r#" r#"
const yo = { a: { b: { c: '123' } } } const yo = { a: { b: { c: '123' } } }
// this is a comment // this is a comment
@ -1920,7 +1914,7 @@ const key = 'c'"#,
31, 31,
); );
assert_eq!(parser.make_non_code_node(index).unwrap(), expected_output); assert_eq!(parser.make_non_code_node(index).unwrap(), expected_output);
let tokens = crate::tokeniser::lexer( let tokens = crate::token::lexer(
r#"const mySketch = startSketchAt([0,0]) r#"const mySketch = startSketchAt([0,0])
|> lineTo({ to: [0, 1], tag: 'myPath' }, %) |> lineTo({ to: [0, 1], tag: 'myPath' }, %)
|> lineTo([1, 1], %) /* this is |> lineTo([1, 1], %) /* this is
@ -1946,7 +1940,7 @@ const key = 'c'"#,
#[test] #[test]
fn test_collect_object_keys() { fn test_collect_object_keys() {
let tokens = crate::tokeniser::lexer("const prop = yo.one[\"two\"]"); let tokens = crate::token::lexer("const prop = yo.one[\"two\"]");
let parser = Parser::new(tokens); let parser = Parser::new(tokens);
let keys_info = parser.collect_object_keys(6, None, false).unwrap(); let keys_info = parser.collect_object_keys(6, None, false).unwrap();
assert_eq!(keys_info.len(), 2); assert_eq!(keys_info.len(), 2);
@ -1966,7 +1960,7 @@ const key = 'c'"#,
#[test] #[test]
fn test_make_literal_call_expression() { fn test_make_literal_call_expression() {
let tokens = crate::tokeniser::lexer("log(5, \"hello\", aIdentifier)"); let tokens = crate::token::lexer("log(5, \"hello\", aIdentifier)");
let parser = Parser::new(tokens); let parser = Parser::new(tokens);
let literal = parser.make_literal(2).unwrap(); let literal = parser.make_literal(2).unwrap();
assert_eq!( assert_eq!(
@ -1990,74 +1984,88 @@ const key = 'c'"#,
); );
} }
#[test]
fn test_is_code_token() {
let tokens = [
Token {
token_type: TokenType::Word,
start: 0,
end: 3,
value: "log".to_string(),
},
Token {
token_type: TokenType::Brace,
start: 3,
end: 4,
value: "(".to_string(),
},
Token {
token_type: TokenType::Number,
start: 4,
end: 5,
value: "5".to_string(),
},
Token {
token_type: TokenType::Comma,
start: 5,
end: 6,
value: ",".to_string(),
},
Token {
token_type: TokenType::String,
start: 7,
end: 14,
value: "\"hello\"".to_string(),
},
Token {
token_type: TokenType::Word,
start: 16,
end: 27,
value: "aIdentifier".to_string(),
},
Token {
token_type: TokenType::Brace,
start: 27,
end: 28,
value: ")".to_string(),
},
];
for (i, token) in tokens.iter().enumerate() {
assert!(token.is_code_token(), "failed test {i}: {token:?}")
}
}
#[test] #[test]
fn test_is_not_code_token() { fn test_is_not_code_token() {
-         assert!(!is_not_code_token(&Token { token_type: TokenType::Word, start: 0, end: 3, value: "log".to_string(), }));
-         assert!(!is_not_code_token(&Token { token_type: TokenType::Brace, start: 3, end: 4, value: "(".to_string(), }));
-         assert!(!is_not_code_token(&Token { token_type: TokenType::Number, start: 4, end: 5, value: "5".to_string(), }));
-         assert!(!is_not_code_token(&Token { token_type: TokenType::Comma, start: 5, end: 6, value: ",".to_string(), }));
-         assert!(is_not_code_token(&Token { token_type: TokenType::Whitespace, start: 6, end: 7, value: " ".to_string(), }));
-         assert!(!is_not_code_token(&Token { token_type: TokenType::String, start: 7, end: 14, value: "\"hello\"".to_string(), }));
-         assert!(!is_not_code_token(&Token { token_type: TokenType::Word, start: 16, end: 27, value: "aIdentifier".to_string(), }));
-         assert!(!is_not_code_token(&Token { token_type: TokenType::Brace, start: 27, end: 28, value: ")".to_string(), }));
-         assert!(is_not_code_token(&Token { token_type: TokenType::BlockComment, start: 28, end: 30, value: "/* abte */".to_string(), }));
-         assert!(is_not_code_token(&Token { token_type: TokenType::LineComment, start: 30, end: 33, value: "// yoyo a line".to_string(), }));
+         let tokens = [
+             Token {
+                 token_type: TokenType::Whitespace,
+                 start: 6,
+                 end: 7,
+                 value: " ".to_string(),
+             },
+             Token {
+                 token_type: TokenType::BlockComment,
+                 start: 28,
+                 end: 30,
+                 value: "/* abte */".to_string(),
+             },
+             Token {
+                 token_type: TokenType::LineComment,
+                 start: 30,
+                 end: 33,
+                 value: "// yoyo a line".to_string(),
+             },
+         ];
+         for (i, token) in tokens.iter().enumerate() {
+             assert!(!token.is_code_token(), "failed test {i}: {token:?}")
+         }
} }
#[test] #[test]
fn test_next_meaningful_token() { fn test_next_meaningful_token() {
let _offset = 1; let _offset = 1;
let tokens = crate::tokeniser::lexer( let tokens = crate::token::lexer(
r#"const mySketch = startSketchAt([0,0]) r#"const mySketch = startSketchAt([0,0])
|> lineTo({ to: [0, 1], tag: 'myPath' }, %) |> lineTo({ to: [0, 1], tag: 'myPath' }, %)
|> lineTo([1, 1], %) /* this is |> lineTo([1, 1], %) /* this is
@ -2443,7 +2451,7 @@ const key = 'c'"#,
#[test] #[test]
fn test_find_closing_brace() { fn test_find_closing_brace() {
let tokens = crate::tokeniser::lexer( let tokens = crate::token::lexer(
r#"const mySketch = startSketchAt([0,0]) r#"const mySketch = startSketchAt([0,0])
|> lineTo({ to: [0, 1], tag: 'myPath' }, %) |> lineTo({ to: [0, 1], tag: 'myPath' }, %)
|> lineTo([1, 1], %) /* this is |> lineTo([1, 1], %) /* this is
@ -2460,16 +2468,16 @@ const key = 'c'"#,
assert_eq!(parser.find_closing_brace(90, 0, "").unwrap(), 92); assert_eq!(parser.find_closing_brace(90, 0, "").unwrap(), 92);
let basic = "( hey )"; let basic = "( hey )";
let parser = Parser::new(crate::tokeniser::lexer(basic)); let parser = Parser::new(crate::token::lexer(basic));
assert_eq!(parser.find_closing_brace(0, 0, "").unwrap(), 4); assert_eq!(parser.find_closing_brace(0, 0, "").unwrap(), 4);
let handles_non_zero_index = "(indexForBracketToRightOfThisIsTwo(shouldBeFour)AndNotThisSix)"; let handles_non_zero_index = "(indexForBracketToRightOfThisIsTwo(shouldBeFour)AndNotThisSix)";
let parser = Parser::new(crate::tokeniser::lexer(handles_non_zero_index)); let parser = Parser::new(crate::token::lexer(handles_non_zero_index));
assert_eq!(parser.find_closing_brace(2, 0, "").unwrap(), 4); assert_eq!(parser.find_closing_brace(2, 0, "").unwrap(), 4);
assert_eq!(parser.find_closing_brace(0, 0, "").unwrap(), 6); assert_eq!(parser.find_closing_brace(0, 0, "").unwrap(), 6);
let handles_nested = "{a{b{c(}d]}eathou athoeu tah u} thatOneToTheLeftIsLast }"; let handles_nested = "{a{b{c(}d]}eathou athoeu tah u} thatOneToTheLeftIsLast }";
let parser = Parser::new(crate::tokeniser::lexer(handles_nested)); let parser = Parser::new(crate::token::lexer(handles_nested));
assert_eq!(parser.find_closing_brace(0, 0, "").unwrap(), 18); assert_eq!(parser.find_closing_brace(0, 0, "").unwrap(), 18);
// TODO expect error when not started on a brace // TODO expect error when not started on a brace
@ -2477,7 +2485,7 @@ const key = 'c'"#,
#[test] #[test]
fn test_is_call_expression() { fn test_is_call_expression() {
let tokens = crate::tokeniser::lexer( let tokens = crate::token::lexer(
r#"const mySketch = startSketchAt([0,0]) r#"const mySketch = startSketchAt([0,0])
|> lineTo({ to: [0, 1], tag: 'myPath' }, %) |> lineTo({ to: [0, 1], tag: 'myPath' }, %)
|> lineTo([1, 1], %) /* this is |> lineTo([1, 1], %) /* this is
@ -2498,7 +2506,7 @@ const key = 'c'"#,
#[test] #[test]
fn test_find_next_declaration_keyword() { fn test_find_next_declaration_keyword() {
let tokens = crate::tokeniser::lexer( let tokens = crate::token::lexer(
r#"const mySketch = startSketchAt([0,0]) r#"const mySketch = startSketchAt([0,0])
|> lineTo({ to: [0, 1], tag: 'myPath' }, %) |> lineTo({ to: [0, 1], tag: 'myPath' }, %)
|> lineTo([1, 1], %) /* this is |> lineTo([1, 1], %) /* this is
@ -2513,7 +2521,7 @@ const key = 'c'"#,
TokenReturn { token: None, index: 92 } TokenReturn { token: None, index: 92 }
); );
let tokens = crate::tokeniser::lexer( let tokens = crate::token::lexer(
r#"const myVar = 5 r#"const myVar = 5
const newVar = myVar + 1 const newVar = myVar + 1
"#, "#,
@ -2543,7 +2551,7 @@ const newVar = myVar + 1
lineTo(2, 3) lineTo(2, 3)
} |> rx(45, %) } |> rx(45, %)
"#; "#;
let tokens = crate::tokeniser::lexer(code); let tokens = crate::token::lexer(code);
let parser = Parser::new(tokens); let parser = Parser::new(tokens);
assert_eq!( assert_eq!(
parser.has_pipe_operator(0, None).unwrap(), parser.has_pipe_operator(0, None).unwrap(),
@ -2562,7 +2570,7 @@ const newVar = myVar + 1
lineTo(2, 3) lineTo(2, 3)
} |> rx(45, %) |> rx(45, %) } |> rx(45, %) |> rx(45, %)
"#; "#;
let tokens = crate::tokeniser::lexer(code); let tokens = crate::token::lexer(code);
let parser = Parser::new(tokens); let parser = Parser::new(tokens);
assert_eq!( assert_eq!(
parser.has_pipe_operator(0, None).unwrap(), parser.has_pipe_operator(0, None).unwrap(),
@ -2584,7 +2592,7 @@ const newVar = myVar + 1
const yo = myFunc(9() const yo = myFunc(9()
|> rx(45, %) |> rx(45, %)
"#; "#;
let tokens = crate::tokeniser::lexer(code); let tokens = crate::token::lexer(code);
let parser = Parser::new(tokens); let parser = Parser::new(tokens);
assert_eq!( assert_eq!(
parser.has_pipe_operator(0, None).unwrap(), parser.has_pipe_operator(0, None).unwrap(),
@ -2596,7 +2604,7 @@ const yo = myFunc(9()
); );
let code = "const myVar2 = 5 + 1 |> myFn(%)"; let code = "const myVar2 = 5 + 1 |> myFn(%)";
let tokens = crate::tokeniser::lexer(code); let tokens = crate::token::lexer(code);
let parser = Parser::new(tokens); let parser = Parser::new(tokens);
assert_eq!( assert_eq!(
parser.has_pipe_operator(1, None).unwrap(), parser.has_pipe_operator(1, None).unwrap(),
@ -2618,7 +2626,7 @@ const yo = myFunc(9()
lineTo(1,1) lineTo(1,1)
} |> rx(90, %) } |> rx(90, %)
show(mySk1)"#; show(mySk1)"#;
let tokens = crate::tokeniser::lexer(code); let tokens = crate::token::lexer(code);
let parser = Parser::new(tokens.clone()); let parser = Parser::new(tokens.clone());
let token_with_my_path_index = tokens.iter().position(|token| token.value == "myPath").unwrap(); let token_with_my_path_index = tokens.iter().position(|token| token.value == "myPath").unwrap();
// loop through getting the token and its index // loop through getting the token and its index
@ -2658,7 +2666,7 @@ show(mySk1)"#;
#[test] #[test]
fn test_make_member_expression() { fn test_make_member_expression() {
let tokens = crate::tokeniser::lexer("const prop = yo.one[\"two\"]"); let tokens = crate::token::lexer("const prop = yo.one[\"two\"]");
let parser = Parser::new(tokens); let parser = Parser::new(tokens);
let member_expression_return = parser.make_member_expression(6).unwrap(); let member_expression_return = parser.make_member_expression(6).unwrap();
let member_expression = member_expression_return.expression; let member_expression = member_expression_return.expression;
@ -2700,63 +2708,63 @@ show(mySk1)"#;
#[test] #[test]
fn test_find_end_of_binary_expression() { fn test_find_end_of_binary_expression() {
let code = "1 + 2 * 3\nconst yo = 5"; let code = "1 + 2 * 3\nconst yo = 5";
let tokens = crate::tokeniser::lexer(code); let tokens = crate::token::lexer(code);
let parser = Parser::new(tokens.clone()); let parser = Parser::new(tokens.clone());
let end = parser.find_end_of_binary_expression(0).unwrap(); let end = parser.find_end_of_binary_expression(0).unwrap();
assert_eq!(tokens[end].value, "3"); assert_eq!(tokens[end].value, "3");
let code = "(1 + 25) / 5 - 3\nconst yo = 5"; let code = "(1 + 25) / 5 - 3\nconst yo = 5";
let tokens = crate::tokeniser::lexer(code); let tokens = crate::token::lexer(code);
let parser = Parser::new(tokens.clone()); let parser = Parser::new(tokens.clone());
let end = parser.find_end_of_binary_expression(0).unwrap(); let end = parser.find_end_of_binary_expression(0).unwrap();
assert_eq!(tokens[end].value, "3"); assert_eq!(tokens[end].value, "3");
let index_of_5 = code.find('5').unwrap(); let index_of_5 = code.find('5').unwrap();
let end_starting_at_the_5 = parser.find_end_of_binary_expression(index_of_5).unwrap(); let end_starting_at_the_5 = parser.find_end_of_binary_expression(index_of_5).unwrap();
assert_eq!(end_starting_at_the_5, end); assert_eq!(end_starting_at_the_5, end);
// whole thing wraped // whole thing wrapped
let code = "((1 + 2) / 5 - 3)\nconst yo = 5"; let code = "((1 + 2) / 5 - 3)\nconst yo = 5";
let tokens = crate::tokeniser::lexer(code); let tokens = crate::token::lexer(code);
let parser = Parser::new(tokens.clone()); let parser = Parser::new(tokens.clone());
let end = parser.find_end_of_binary_expression(0).unwrap(); let end = parser.find_end_of_binary_expression(0).unwrap();
assert_eq!(tokens[end].end, code.find("3)").unwrap() + 2); assert_eq!(tokens[end].end, code.find("3)").unwrap() + 2);
// whole thing wraped but given index after the first brace // whole thing wrapped but given index after the first brace
let code = "((1 + 2) / 5 - 3)\nconst yo = 5"; let code = "((1 + 2) / 5 - 3)\nconst yo = 5";
let tokens = crate::tokeniser::lexer(code); let tokens = crate::token::lexer(code);
let parser = Parser::new(tokens.clone()); let parser = Parser::new(tokens.clone());
let end = parser.find_end_of_binary_expression(1).unwrap(); let end = parser.find_end_of_binary_expression(1).unwrap();
assert_eq!(tokens[end].value, "3"); assert_eq!(tokens[end].value, "3");
// given the index of a small wrapped section i.e. `1 + 2` in ((1 + 2) / 5 - 3)' // given the index of a small wrapped section i.e. `1 + 2` in ((1 + 2) / 5 - 3)'
let code = "((1 + 2) / 5 - 3)\nconst yo = 5"; let code = "((1 + 2) / 5 - 3)\nconst yo = 5";
let tokens = crate::tokeniser::lexer(code); let tokens = crate::token::lexer(code);
let parser = Parser::new(tokens.clone()); let parser = Parser::new(tokens.clone());
let end = parser.find_end_of_binary_expression(2).unwrap(); let end = parser.find_end_of_binary_expression(2).unwrap();
assert_eq!(tokens[end].value, "2"); assert_eq!(tokens[end].value, "2");
// lots of silly nesting // lots of silly nesting
let code = "(1 + 2) / (5 - (3))\nconst yo = 5"; let code = "(1 + 2) / (5 - (3))\nconst yo = 5";
let tokens = crate::tokeniser::lexer(code); let tokens = crate::token::lexer(code);
let parser = Parser::new(tokens.clone()); let parser = Parser::new(tokens.clone());
let end = parser.find_end_of_binary_expression(0).unwrap(); let end = parser.find_end_of_binary_expression(0).unwrap();
assert_eq!(tokens[end].end, code.find("))").unwrap() + 2); assert_eq!(tokens[end].end, code.find("))").unwrap() + 2);
// with pipe operator at the end // with pipe operator at the end
let code = "(1 + 2) / (5 - (3))\n |> fn(%)"; let code = "(1 + 2) / (5 - (3))\n |> fn(%)";
let tokens = crate::tokeniser::lexer(code); let tokens = crate::token::lexer(code);
let parser = Parser::new(tokens.clone()); let parser = Parser::new(tokens.clone());
let end = parser.find_end_of_binary_expression(0).unwrap(); let end = parser.find_end_of_binary_expression(0).unwrap();
assert_eq!(tokens[end].end, code.find("))").unwrap() + 2); assert_eq!(tokens[end].end, code.find("))").unwrap() + 2);
// with call expression at the start of binary expression // with call expression at the start of binary expression
let code = "yo(2) + 3\n |> fn(%)"; let code = "yo(2) + 3\n |> fn(%)";
let tokens = crate::tokeniser::lexer(code); let tokens = crate::token::lexer(code);
let parser = Parser::new(tokens.clone()); let parser = Parser::new(tokens.clone());
let end = parser.find_end_of_binary_expression(0).unwrap(); let end = parser.find_end_of_binary_expression(0).unwrap();
assert_eq!(tokens[end].value, "3"); assert_eq!(tokens[end].value, "3");
// with call expression at the end of binary expression // with call expression at the end of binary expression
let code = "3 + yo(2)\n |> fn(%)"; let code = "3 + yo(2)\n |> fn(%)";
let tokens = crate::tokeniser::lexer(code); let tokens = crate::token::lexer(code);
let parser = Parser::new(tokens); let parser = Parser::new(tokens);
let _end = parser.find_end_of_binary_expression(0).unwrap(); let _end = parser.find_end_of_binary_expression(0).unwrap();
// with call expression at the end of binary expression // with call expression at the end of binary expression
let code = "-legX + 2, "; let code = "-legX + 2, ";
let tokens = crate::tokeniser::lexer(code); let tokens = crate::token::lexer(code);
let parser = Parser::new(tokens.clone()); let parser = Parser::new(tokens.clone());
let end = parser.find_end_of_binary_expression(0).unwrap(); let end = parser.find_end_of_binary_expression(0).unwrap();
assert_eq!(tokens[end].value, "2"); assert_eq!(tokens[end].value, "2");
@ -2765,7 +2773,7 @@ show(mySk1)"#;
#[test] #[test]
fn test_make_array_expression() { fn test_make_array_expression() {
// input_index: 6, output_index: 14, output: {"type":"ArrayExpression","start":11,"end":26,"elements":[{"type":"Literal","start":12,"end":15,"value":"1","raw":"\"1\""},{"type":"Literal","start":17,"end":18,"value":2,"raw":"2"},{"type":"Identifier","start":20,"end":25,"name":"three"}]} // input_index: 6, output_index: 14, output: {"type":"ArrayExpression","start":11,"end":26,"elements":[{"type":"Literal","start":12,"end":15,"value":"1","raw":"\"1\""},{"type":"Literal","start":17,"end":18,"value":2,"raw":"2"},{"type":"Identifier","start":20,"end":25,"name":"three"}]}
let tokens = crate::tokeniser::lexer("const yo = [\"1\", 2, three]"); let tokens = crate::token::lexer("const yo = [\"1\", 2, three]");
let parser = Parser::new(tokens); let parser = Parser::new(tokens);
let array_expression = parser.make_array_expression(6).unwrap(); let array_expression = parser.make_array_expression(6).unwrap();
let expression = array_expression.expression; let expression = array_expression.expression;
@ -2804,7 +2812,7 @@ show(mySk1)"#;
#[test] #[test]
fn test_make_call_expression() { fn test_make_call_expression() {
let tokens = crate::tokeniser::lexer("foo(\"a\", a, 3)"); let tokens = crate::token::lexer("foo(\"a\", a, 3)");
let parser = Parser::new(tokens); let parser = Parser::new(tokens);
let result = parser.make_call_expression(0).unwrap(); let result = parser.make_call_expression(0).unwrap();
assert_eq!(result.last_index, 9); assert_eq!(result.last_index, 9);
@ -2838,7 +2846,7 @@ show(mySk1)"#;
#[test] #[test]
fn test_make_variable_declaration() { fn test_make_variable_declaration() {
let tokens = crate::tokeniser::lexer( let tokens = crate::token::lexer(
r#"const yo = startSketch([0, 0]) r#"const yo = startSketch([0, 0])
|> lineTo([1, myVar], %) |> lineTo([1, myVar], %)
|> foo(myVar2, %) |> foo(myVar2, %)
@ -2908,7 +2916,7 @@ show(mySk1)"#;
#[test] #[test]
fn test_make_body() { fn test_make_body() {
let tokens = crate::tokeniser::lexer("const myVar = 5"); let tokens = crate::token::lexer("const myVar = 5");
let parser = Parser::new(tokens); let parser = Parser::new(tokens);
let body = parser let body = parser
.make_body( .make_body(
@ -2926,7 +2934,7 @@ show(mySk1)"#;
#[test] #[test]
fn test_abstract_syntax_tree() { fn test_abstract_syntax_tree() {
let code = "5 +6"; let code = "5 +6";
let parser = Parser::new(crate::tokeniser::lexer(code)); let parser = Parser::new(crate::token::lexer(code));
let result = parser.ast().unwrap(); let result = parser.ast().unwrap();
let expected_result = Program { let expected_result = Program {
start: 0, start: 0,
@ -2964,8 +2972,8 @@ show(mySk1)"#;
#[test] #[test]
fn test_empty_file() { fn test_empty_file() {
let some_program_string = r#""#; let some_program_string = r#""#;
let tokens = crate::tokeniser::lexer(some_program_string); let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens); let parser = Parser::new(tokens);
let result = parser.ast(); let result = parser.ast();
assert!(result.is_err()); assert!(result.is_err());
assert!(result.err().unwrap().to_string().contains("file is empty")); assert!(result.err().unwrap().to_string().contains("file is empty"));
@ -2973,7 +2981,7 @@ show(mySk1)"#;
#[test] #[test]
fn test_parse_half_pipe_small() { fn test_parse_half_pipe_small() {
let tokens = crate::tokeniser::lexer( let tokens = crate::token::lexer(
"const secondExtrude = startSketchAt([0,0]) "const secondExtrude = startSketchAt([0,0])
|", |",
); );
@ -2985,14 +2993,14 @@ show(mySk1)"#;
#[test] #[test]
fn test_parse_member_expression_double_nested_braces() { fn test_parse_member_expression_double_nested_braces() {
let tokens = crate::tokeniser::lexer(r#"const prop = yo["one"][two]"#); let tokens = crate::token::lexer(r#"const prop = yo["one"][two]"#);
let parser = Parser::new(tokens); let parser = Parser::new(tokens);
parser.ast().unwrap(); parser.ast().unwrap();
} }
#[test] #[test]
fn test_parse_member_expression_binary_expression_period_number_first() { fn test_parse_member_expression_binary_expression_period_number_first() {
let tokens = crate::tokeniser::lexer( let tokens = crate::token::lexer(
r#"const obj = { a: 1, b: 2 } r#"const obj = { a: 1, b: 2 }
const height = 1 - obj.a"#, const height = 1 - obj.a"#,
); );
@ -3002,7 +3010,7 @@ const height = 1 - obj.a"#,
#[test] #[test]
fn test_parse_member_expression_binary_expression_brace_number_first() { fn test_parse_member_expression_binary_expression_brace_number_first() {
let tokens = crate::tokeniser::lexer( let tokens = crate::token::lexer(
r#"const obj = { a: 1, b: 2 } r#"const obj = { a: 1, b: 2 }
const height = 1 - obj["a"]"#, const height = 1 - obj["a"]"#,
); );
@ -3012,7 +3020,7 @@ const height = 1 - obj["a"]"#,
#[test] #[test]
fn test_parse_member_expression_binary_expression_brace_number_second() { fn test_parse_member_expression_binary_expression_brace_number_second() {
let tokens = crate::tokeniser::lexer( let tokens = crate::token::lexer(
r#"const obj = { a: 1, b: 2 } r#"const obj = { a: 1, b: 2 }
const height = obj["a"] - 1"#, const height = obj["a"] - 1"#,
); );
@ -3022,7 +3030,7 @@ const height = obj["a"] - 1"#,
#[test] #[test]
fn test_parse_member_expression_binary_expression_in_array_number_first() { fn test_parse_member_expression_binary_expression_in_array_number_first() {
let tokens = crate::tokeniser::lexer( let tokens = crate::token::lexer(
r#"const obj = { a: 1, b: 2 } r#"const obj = { a: 1, b: 2 }
const height = [1 - obj["a"], 0]"#, const height = [1 - obj["a"], 0]"#,
); );
@ -3032,7 +3040,7 @@ const height = [1 - obj["a"], 0]"#,
#[test] #[test]
fn test_parse_member_expression_binary_expression_in_array_number_second() { fn test_parse_member_expression_binary_expression_in_array_number_second() {
let tokens = crate::tokeniser::lexer( let tokens = crate::token::lexer(
r#"const obj = { a: 1, b: 2 } r#"const obj = { a: 1, b: 2 }
const height = [obj["a"] - 1, 0]"#, const height = [obj["a"] - 1, 0]"#,
); );
@ -3042,7 +3050,7 @@ const height = [obj["a"] - 1, 0]"#,
#[test] #[test]
fn test_parse_member_expression_binary_expression_in_array_number_second_missing_space() { fn test_parse_member_expression_binary_expression_in_array_number_second_missing_space() {
let tokens = crate::tokeniser::lexer( let tokens = crate::token::lexer(
r#"const obj = { a: 1, b: 2 } r#"const obj = { a: 1, b: 2 }
const height = [obj["a"] -1, 0]"#, const height = [obj["a"] -1, 0]"#,
); );
@ -3052,7 +3060,7 @@ const height = [obj["a"] -1, 0]"#,
#[test] #[test]
fn test_parse_half_pipe() { fn test_parse_half_pipe() {
let tokens = crate::tokeniser::lexer( let tokens = crate::token::lexer(
"const height = 10 "const height = 10
const firstExtrude = startSketchAt([0,0]) const firstExtrude = startSketchAt([0,0])
@ -3075,15 +3083,17 @@ const secondExtrude = startSketchAt([0,0])
#[test] #[test]
fn test_parse_greater_bang() { fn test_parse_greater_bang() {
-         let tokens = crate::tokeniser::lexer(">!");
+         let tokens = crate::token::lexer(">!");
          let parser = Parser::new(tokens);
-         let result = parser.ast();
-         assert!(result.is_ok());
+         let err = parser.ast().unwrap_err();
+         // TODO: Better errors when program cannot tokenize.
+         // https://github.com/KittyCAD/modeling-app/issues/696
+         assert!(err.to_string().contains("file is empty"));
} }
#[test] #[test]
fn test_parse_z_percent_parens() { fn test_parse_z_percent_parens() {
let tokens = crate::tokeniser::lexer("z%)"); let tokens = crate::token::lexer("z%)");
let parser = Parser::new(tokens); let parser = Parser::new(tokens);
let result = parser.ast(); let result = parser.ast();
assert!(result.is_err()); assert!(result.is_err());
@ -3092,15 +3102,17 @@ const secondExtrude = startSketchAt([0,0])
#[test] #[test]
fn test_parse_parens_unicode() { fn test_parse_parens_unicode() {
-         let tokens = crate::tokeniser::lexer("");
+         let tokens = crate::token::lexer("");
          let parser = Parser::new(tokens);
          let result = parser.ast();
-         assert!(result.is_ok());
+         // TODO: Better errors when program cannot tokenize.
+         // https://github.com/KittyCAD/modeling-app/issues/696
+         assert!(result.is_err());
} }
#[test] #[test]
fn test_parse_negative_in_array_binary_expression() { fn test_parse_negative_in_array_binary_expression() {
let tokens = crate::tokeniser::lexer( let tokens = crate::token::lexer(
r#"const leg1 = 5 r#"const leg1 = 5
const thickness = 0.56 const thickness = 0.56
@ -3114,7 +3126,7 @@ const bracket = [-leg2 + thickness, 0]
#[test] #[test]
fn test_parse_nested_open_brackets() { fn test_parse_nested_open_brackets() {
let tokens = crate::tokeniser::lexer( let tokens = crate::token::lexer(
r#" r#"
z(-[["#, z(-[["#,
); );
@ -3129,31 +3141,38 @@ z(-[["#,
#[test] #[test]
fn test_parse_weird_new_line_function() { fn test_parse_weird_new_line_function() {
let tokens = crate::tokeniser::lexer( let tokens = crate::token::lexer(
r#"z r#"z
(--#"#, (--#"#,
); );
let parser = Parser::new(tokens); let parser = Parser::new(tokens);
let result = parser.ast(); let result = parser.ast();
assert!(result.is_err()); assert!(result.is_err());
+         // TODO: Better errors when program cannot tokenize.
+         // https://github.com/KittyCAD/modeling-app/issues/696
          assert_eq!(
              result.err().unwrap().to_string(),
-             r#"syntax: KclErrorDetails { source_ranges: [SourceRange([0, 1])], message: "missing a closing brace for the function call" }"#
+             r#"semantic: KclErrorDetails { source_ranges: [], message: "file is empty" }"#
          );
} }
#[test] #[test]
fn test_parse_weird_lots_of_fancy_brackets() { fn test_parse_weird_lots_of_fancy_brackets() {
let tokens = crate::tokeniser::lexer(r#"zz({{{{{{{{)iegAng{{{{{{{##"#); let tokens = crate::token::lexer(r#"zz({{{{{{{{)iegAng{{{{{{{##"#);
let parser = Parser::new(tokens); let parser = Parser::new(tokens);
let result = parser.ast(); let result = parser.ast();
assert!(result.is_err()); assert!(result.is_err());
assert!(result.err().unwrap().to_string().contains("unexpected end")); // TODO: Better errors when program cannot tokenize.
// https://github.com/KittyCAD/modeling-app/issues/696
assert_eq!(
result.err().unwrap().to_string(),
r#"semantic: KclErrorDetails { source_ranges: [], message: "file is empty" }"#
);
} }
#[test] #[test]
fn test_parse_weird_close_before_open() { fn test_parse_weird_close_before_open() {
let tokens = crate::tokeniser::lexer( let tokens = crate::token::lexer(
r#"fn)n r#"fn)n
e e
["#, ["#,
@ -3170,7 +3189,7 @@ e
#[test] #[test]
fn test_parse_weird_close_before_nada() { fn test_parse_weird_close_before_nada() {
let tokens = crate::tokeniser::lexer(r#"fn)n-"#); let tokens = crate::token::lexer(r#"fn)n-"#);
let parser = Parser::new(tokens); let parser = Parser::new(tokens);
let result = parser.ast(); let result = parser.ast();
assert!(result.is_err()); assert!(result.is_err());
@ -3179,7 +3198,7 @@ e
#[test] #[test]
fn test_parse_weird_lots_of_slashes() { fn test_parse_weird_lots_of_slashes() {
let tokens = crate::tokeniser::lexer( let tokens = crate::token::lexer(
r#"J///////////o//+///////////P++++*++++++P///////˟ r#"J///////////o//+///////////P++++*++++++P///////˟
++4"#, ++4"#,
); );
@ -3196,7 +3215,7 @@ e
#[test] #[test]
fn test_parse_expand_array() { fn test_parse_expand_array() {
let code = "const myArray = [0..10]"; let code = "const myArray = [0..10]";
let parser = Parser::new(crate::tokeniser::lexer(code)); let parser = Parser::new(crate::token::lexer(code));
let result = parser.ast().unwrap(); let result = parser.ast().unwrap();
let expected_result = Program { let expected_result = Program {
start: 0, start: 0,
@ -3299,8 +3318,8 @@ e
#[test] #[test]
fn test_error_keyword_in_variable() { fn test_error_keyword_in_variable() {
let some_program_string = r#"const let = "thing""#; let some_program_string = r#"const let = "thing""#;
let tokens = crate::tokeniser::lexer(some_program_string); let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens); let parser = Parser::new(tokens);
let result = parser.ast(); let result = parser.ast();
assert!(result.is_err()); assert!(result.is_err());
assert_eq!( assert_eq!(
@ -3312,8 +3331,8 @@ e
#[test] #[test]
fn test_error_keyword_in_fn_name() { fn test_error_keyword_in_fn_name() {
let some_program_string = r#"fn let = () {}"#; let some_program_string = r#"fn let = () {}"#;
let tokens = crate::tokeniser::lexer(some_program_string); let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens); let parser = Parser::new(tokens);
let result = parser.ast(); let result = parser.ast();
assert!(result.is_err()); assert!(result.is_err());
assert_eq!( assert_eq!(
@ -3325,8 +3344,8 @@ e
#[test] #[test]
fn test_error_stdlib_in_fn_name() { fn test_error_stdlib_in_fn_name() {
let some_program_string = r#"fn cos = () {}"#; let some_program_string = r#"fn cos = () {}"#;
let tokens = crate::tokeniser::lexer(some_program_string); let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens); let parser = Parser::new(tokens);
let result = parser.ast(); let result = parser.ast();
assert!(result.is_err()); assert!(result.is_err());
assert_eq!( assert_eq!(
@ -3340,8 +3359,8 @@ e
let some_program_string = r#"fn thing = (let) => { let some_program_string = r#"fn thing = (let) => {
return 1 return 1
}"#; }"#;
let tokens = crate::tokeniser::lexer(some_program_string); let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens); let parser = Parser::new(tokens);
let result = parser.ast(); let result = parser.ast();
assert!(result.is_err()); assert!(result.is_err());
assert_eq!( assert_eq!(
@ -3355,8 +3374,8 @@ e
let some_program_string = r#"fn thing = (cos) => { let some_program_string = r#"fn thing = (cos) => {
return 1 return 1
}"#; }"#;
let tokens = crate::tokeniser::lexer(some_program_string); let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens); let parser = Parser::new(tokens);
let result = parser.ast(); let result = parser.ast();
assert!(result.is_err()); assert!(result.is_err());
assert_eq!( assert_eq!(
@ -3373,8 +3392,8 @@ e
} }
firstPrimeNumber() firstPrimeNumber()
"#; "#;
let tokens = crate::tokeniser::lexer(program); let tokens = crate::token::lexer(program);
let parser = crate::parser::Parser::new(tokens); let parser = Parser::new(tokens);
let _ast = parser.ast().unwrap(); let _ast = parser.ast().unwrap();
} }
@ -3386,8 +3405,8 @@ e
thing(false) thing(false)
"#; "#;
let tokens = crate::tokeniser::lexer(some_program_string); let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens); let parser = Parser::new(tokens);
parser.ast().unwrap(); parser.ast().unwrap();
} }
@ -3403,8 +3422,8 @@ thing(false)
"#, "#,
name name
); );
let tokens = crate::tokeniser::lexer(&some_program_string); let tokens = crate::token::lexer(&some_program_string);
let parser = crate::parser::Parser::new(tokens); let parser = Parser::new(tokens);
let result = parser.ast(); let result = parser.ast();
assert!(result.is_err()); assert!(result.is_err());
assert_eq!( assert_eq!(
@ -3421,8 +3440,8 @@ thing(false)
#[test] #[test]
fn test_error_define_var_as_function() { fn test_error_define_var_as_function() {
let some_program_string = r#"fn thing = "thing""#; let some_program_string = r#"fn thing = "thing""#;
let tokens = crate::tokeniser::lexer(some_program_string); let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens); let parser = Parser::new(tokens);
let result = parser.ast(); let result = parser.ast();
assert!(result.is_err()); assert!(result.is_err());
assert_eq!( assert_eq!(
@ -3450,8 +3469,8 @@ const pt2 = b2[0]
show(b1) show(b1)
show(b2)"#; show(b2)"#;
let tokens = crate::tokeniser::lexer(some_program_string); let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens); let parser = Parser::new(tokens);
parser.ast().unwrap(); parser.ast().unwrap();
} }
@ -3459,18 +3478,36 @@ show(b2)"#;
fn test_math_with_stdlib() { fn test_math_with_stdlib() {
let some_program_string = r#"const d2r = pi() / 2 let some_program_string = r#"const d2r = pi() / 2
let other_thing = 2 * cos(3)"#; let other_thing = 2 * cos(3)"#;
let tokens = crate::tokeniser::lexer(some_program_string); let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens); let parser = Parser::new(tokens);
parser.ast().unwrap(); parser.ast().unwrap();
} }
#[test] #[test]
#[ignore] // ignore until more stack fixes
fn test_parse_pipes_on_pipes() { fn test_parse_pipes_on_pipes() {
let code = include_str!("../../tests/executor/inputs/pipes_on_pipes.kcl"); let code = include_str!("../../tests/executor/inputs/pipes_on_pipes.kcl");
let tokens = crate::tokeniser::lexer(code); let tokens = crate::token::lexer(code);
let parser = crate::parser::Parser::new(tokens); let parser = Parser::new(tokens);
parser.ast().unwrap();
}
#[test]
fn test_negative_arguments() {
let some_program_string = r#"fn box = (p, h, l, w) => {
const myBox = startSketchAt(p)
|> line([0, l], %)
|> line([w, 0], %)
|> line([0, -l], %)
|> close(%)
|> extrude(h, %)
return myBox
}
let myBox = box([0,0], -3, -16, -10)
show(myBox)"#;
let tokens = crate::token::lexer(some_program_string);
let parser = Parser::new(tokens);
parser.ast().unwrap(); parser.ast().unwrap();
} }
} }
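Several of the updated tests above now expect a "file is empty" error where the old parser reported a syntax error. The pattern, as the TODOs note (issue #696), appears to be that the new winnow-based lexer yields no tokens for input it cannot tokenize, so the parser treats the program as empty. A sketch of that behaviour exactly as the tests exercise it:

    // Assumption drawn from the updated tests: untokenizable input produces an
    // empty token stream, so ast() reports the file as empty.
    let tokens = crate::token::lexer(">!");
    let parser = crate::parser::Parser::new(tokens);
    let err = parser.ast().unwrap_err();
    assert!(err.to_string().contains("file is empty"));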

View File

@ -34,7 +34,7 @@ pub struct Backend {
/// The types of tokens the server supports. /// The types of tokens the server supports.
pub token_types: Vec<SemanticTokenType>, pub token_types: Vec<SemanticTokenType>,
/// Token maps. /// Token maps.
pub token_map: DashMap<String, Vec<crate::tokeniser::Token>>, pub token_map: DashMap<String, Vec<crate::token::Token>>,
/// AST maps. /// AST maps.
pub ast_map: DashMap<String, crate::ast::types::Program>, pub ast_map: DashMap<String, crate::ast::types::Program>,
/// Current code. /// Current code.
@ -56,7 +56,7 @@ impl Backend {
// Let's update the tokens. // Let's update the tokens.
self.current_code_map self.current_code_map
.insert(params.uri.to_string(), params.text.clone()); .insert(params.uri.to_string(), params.text.clone());
let tokens = crate::tokeniser::lexer(&params.text); let tokens = crate::token::lexer(&params.text);
self.token_map.insert(params.uri.to_string(), tokens.clone()); self.token_map.insert(params.uri.to_string(), tokens.clone());
// Update the semantic tokens map. // Update the semantic tokens map.
@ -69,9 +69,7 @@ impl Backend {
continue; continue;
}; };
-             if token.token_type == crate::tokeniser::TokenType::Word
-                 && self.stdlib_completions.contains_key(&token.value)
-             {
+             if token.token_type == crate::token::TokenType::Word && self.stdlib_completions.contains_key(&token.value) {
// This is a stdlib function. // This is a stdlib function.
token_type = SemanticTokenType::FUNCTION; token_type = SemanticTokenType::FUNCTION;
} }
@ -549,7 +547,7 @@ impl LanguageServer for Backend {
// Parse the ast. // Parse the ast.
// I don't know if we need to do this again since it should be updated in the context. // I don't know if we need to do this again since it should be updated in the context.
// But I figure better safe than sorry since this will write back out to the file. // But I figure better safe than sorry since this will write back out to the file.
let tokens = crate::tokeniser::lexer(&current_code); let tokens = crate::token::lexer(&current_code);
let parser = crate::parser::Parser::new(tokens); let parser = crate::parser::Parser::new(tokens);
let Ok(ast) = parser.ast() else { let Ok(ast) = parser.ast() else {
return Ok(None); return Ok(None);
@ -581,7 +579,7 @@ impl LanguageServer for Backend {
// Parse the ast. // Parse the ast.
// I don't know if we need to do this again since it should be updated in the context. // I don't know if we need to do this again since it should be updated in the context.
// But I figure better safe than sorry since this will write back out to the file. // But I figure better safe than sorry since this will write back out to the file.
let tokens = crate::tokeniser::lexer(&current_code); let tokens = crate::token::lexer(&current_code);
let parser = crate::parser::Parser::new(tokens); let parser = crate::parser::Parser::new(tokens);
let Ok(mut ast) = parser.ast() else { let Ok(mut ast) = parser.ast() else {
return Ok(None); return Ok(None);

View File

@ -11,10 +11,10 @@ use crate::{
}; };
/// Extrudes by a given amount. /// Extrudes by a given amount.
pub fn extrude(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn extrude(args: Args) -> Result<MemoryItem, KclError> {
let (length, sketch_group) = args.get_number_sketch_group()?; let (length, sketch_group) = args.get_number_sketch_group()?;
let result = inner_extrude(length, sketch_group, args)?; let result = inner_extrude(length, sketch_group, args).await?;
Ok(MemoryItem::ExtrudeGroup(result)) Ok(MemoryItem::ExtrudeGroup(result))
} }
@ -23,7 +23,7 @@ pub fn extrude(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib { #[stdlib {
name = "extrude" name = "extrude"
}] }]
fn inner_extrude(length: f64, sketch_group: Box<SketchGroup>, args: &mut Args) -> Result<Box<ExtrudeGroup>, KclError> { async fn inner_extrude(length: f64, sketch_group: Box<SketchGroup>, args: Args) -> Result<Box<ExtrudeGroup>, KclError> {
let id = uuid::Uuid::new_v4(); let id = uuid::Uuid::new_v4();
let cmd = kittycad::types::ModelingCmd::Extrude { let cmd = kittycad::types::ModelingCmd::Extrude {
@ -31,7 +31,7 @@ fn inner_extrude(length: f64, sketch_group: Box<SketchGroup>, args: &mut Args) -
distance: length, distance: length,
cap: true, cap: true,
}; };
args.send_modeling_cmd(id, cmd)?; args.send_modeling_cmd(id, cmd).await?;
Ok(Box::new(ExtrudeGroup { Ok(Box::new(ExtrudeGroup {
id, id,
@ -46,7 +46,7 @@ fn inner_extrude(length: f64, sketch_group: Box<SketchGroup>, args: &mut Args) -
} }
/// Returns the extrude wall transform. /// Returns the extrude wall transform.
pub fn get_extrude_wall_transform(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn get_extrude_wall_transform(args: Args) -> Result<MemoryItem, KclError> {
let (surface_name, extrude_group) = args.get_path_name_extrude_group()?; let (surface_name, extrude_group) = args.get_path_name_extrude_group()?;
let result = inner_get_extrude_wall_transform(&surface_name, *extrude_group, args)?; let result = inner_get_extrude_wall_transform(&surface_name, *extrude_group, args)?;
Ok(MemoryItem::ExtrudeTransform(result)) Ok(MemoryItem::ExtrudeTransform(result))
@ -59,8 +59,8 @@ pub fn get_extrude_wall_transform(args: &mut Args) -> Result<MemoryItem, KclErro
fn inner_get_extrude_wall_transform( fn inner_get_extrude_wall_transform(
surface_name: &str, surface_name: &str,
extrude_group: ExtrudeGroup, extrude_group: ExtrudeGroup,
args: &mut Args, args: Args,
) -> Result<ExtrudeTransform, KclError> { ) -> Result<Box<ExtrudeTransform>, KclError> {
let surface = extrude_group.get_path_by_name(surface_name).ok_or_else(|| { let surface = extrude_group.get_path_by_name(surface_name).ok_or_else(|| {
KclError::Type(KclErrorDetails { KclError::Type(KclErrorDetails {
message: format!( message: format!(
@ -71,9 +71,9 @@ fn inner_get_extrude_wall_transform(
}) })
})?; })?;
Ok(ExtrudeTransform { Ok(Box::new(ExtrudeTransform {
position: surface.get_position(), position: surface.get_position(),
rotation: surface.get_rotation(), rotation: surface.get_rotation(),
meta: extrude_group.meta, meta: extrude_group.meta,
}) }))
} }
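The stdlib functions switch from fn f(args: &mut Args) to async fn f(args: Args): each builtin now takes Args by value and awaits its engine round-trips instead of blocking. An illustrative caller-side sketch only; the real dispatch table for builtins is not part of this diff:

    // Illustrative only: awaiting one of the async builtins shown above.
    async fn demo(args: Args) -> Result<MemoryItem, KclError> {
        // extrude now takes Args by value and must be awaited.
        extrude(args).await
    }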

View File

@ -11,7 +11,7 @@ use crate::{
}; };
/// Computes the cosine of a number (in radians). /// Computes the cosine of a number (in radians).
pub fn cos(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn cos(args: Args) -> Result<MemoryItem, KclError> {
let num = args.get_number()?; let num = args.get_number()?;
let result = inner_cos(num)?; let result = inner_cos(num)?;
@ -27,7 +27,7 @@ fn inner_cos(num: f64) -> Result<f64, KclError> {
} }
/// Computes the sine of a number (in radians). /// Computes the sine of a number (in radians).
pub fn sin(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn sin(args: Args) -> Result<MemoryItem, KclError> {
let num = args.get_number()?; let num = args.get_number()?;
let result = inner_sin(num)?; let result = inner_sin(num)?;
@ -43,7 +43,7 @@ fn inner_sin(num: f64) -> Result<f64, KclError> {
} }
/// Computes the tangent of a number (in radians). /// Computes the tangent of a number (in radians).
pub fn tan(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn tan(args: Args) -> Result<MemoryItem, KclError> {
let num = args.get_number()?; let num = args.get_number()?;
let result = inner_tan(num)?; let result = inner_tan(num)?;
@ -59,7 +59,7 @@ fn inner_tan(num: f64) -> Result<f64, KclError> {
} }
/// Return the value of `pi`. Archimedes constant (π). /// Return the value of `pi`. Archimedes constant (π).
pub fn pi(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn pi(args: Args) -> Result<MemoryItem, KclError> {
let result = inner_pi()?; let result = inner_pi()?;
args.make_user_val_from_f64(result) args.make_user_val_from_f64(result)
@ -74,7 +74,7 @@ fn inner_pi() -> Result<f64, KclError> {
} }
/// Computes the square root of a number. /// Computes the square root of a number.
pub fn sqrt(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn sqrt(args: Args) -> Result<MemoryItem, KclError> {
let num = args.get_number()?; let num = args.get_number()?;
let result = inner_sqrt(num)?; let result = inner_sqrt(num)?;
@ -90,7 +90,7 @@ fn inner_sqrt(num: f64) -> Result<f64, KclError> {
} }
/// Computes the absolute value of a number. /// Computes the absolute value of a number.
pub fn abs(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn abs(args: Args) -> Result<MemoryItem, KclError> {
let num = args.get_number()?; let num = args.get_number()?;
let result = inner_abs(num)?; let result = inner_abs(num)?;
@ -106,7 +106,7 @@ fn inner_abs(num: f64) -> Result<f64, KclError> {
} }
/// Computes the largest integer less than or equal to a number. /// Computes the largest integer less than or equal to a number.
pub fn floor(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn floor(args: Args) -> Result<MemoryItem, KclError> {
let num = args.get_number()?; let num = args.get_number()?;
let result = inner_floor(num)?; let result = inner_floor(num)?;
@ -122,7 +122,7 @@ fn inner_floor(num: f64) -> Result<f64, KclError> {
} }
/// Computes the smallest integer greater than or equal to a number. /// Computes the smallest integer greater than or equal to a number.
pub fn ceil(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn ceil(args: Args) -> Result<MemoryItem, KclError> {
let num = args.get_number()?; let num = args.get_number()?;
let result = inner_ceil(num)?; let result = inner_ceil(num)?;
@ -138,7 +138,7 @@ fn inner_ceil(num: f64) -> Result<f64, KclError> {
} }
/// Computes the minimum of the given arguments. /// Computes the minimum of the given arguments.
pub fn min(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn min(args: Args) -> Result<MemoryItem, KclError> {
let nums = args.get_number_array()?; let nums = args.get_number_array()?;
let result = inner_min(nums); let result = inner_min(nums);
@ -161,7 +161,7 @@ fn inner_min(args: Vec<f64>) -> f64 {
} }
/// Computes the maximum of the given arguments. /// Computes the maximum of the given arguments.
pub fn max(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn max(args: Args) -> Result<MemoryItem, KclError> {
let nums = args.get_number_array()?; let nums = args.get_number_array()?;
let result = inner_max(nums); let result = inner_max(nums);
@ -184,7 +184,7 @@ fn inner_max(args: Vec<f64>) -> f64 {
} }
/// Computes the number to a power. /// Computes the number to a power.
pub fn pow(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn pow(args: Args) -> Result<MemoryItem, KclError> {
let nums = args.get_number_array()?; let nums = args.get_number_array()?;
if nums.len() > 2 { if nums.len() > 2 {
return Err(KclError::Type(KclErrorDetails { return Err(KclError::Type(KclErrorDetails {
@ -214,7 +214,7 @@ fn inner_pow(num: f64, pow: f64) -> Result<f64, KclError> {
} }
/// Computes the arccosine of a number (in radians). /// Computes the arccosine of a number (in radians).
pub fn acos(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn acos(args: Args) -> Result<MemoryItem, KclError> {
let num = args.get_number()?; let num = args.get_number()?;
let result = inner_acos(num)?; let result = inner_acos(num)?;
@ -230,7 +230,7 @@ fn inner_acos(num: f64) -> Result<f64, KclError> {
} }
/// Computes the arcsine of a number (in radians). /// Computes the arcsine of a number (in radians).
pub fn asin(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn asin(args: Args) -> Result<MemoryItem, KclError> {
let num = args.get_number()?; let num = args.get_number()?;
let result = inner_asin(num)?; let result = inner_asin(num)?;
@ -246,7 +246,7 @@ fn inner_asin(num: f64) -> Result<f64, KclError> {
} }
/// Computes the arctangent of a number (in radians). /// Computes the arctangent of a number (in radians).
pub fn atan(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn atan(args: Args) -> Result<MemoryItem, KclError> {
let num = args.get_number()?; let num = args.get_number()?;
let result = inner_atan(num)?; let result = inner_atan(num)?;
@ -266,7 +266,7 @@ fn inner_atan(num: f64) -> Result<f64, KclError> {
/// The result might not be correctly rounded owing to implementation /// The result might not be correctly rounded owing to implementation
/// details; `log2()` can produce more accurate results for base 2, /// details; `log2()` can produce more accurate results for base 2,
/// and `log10()` can produce more accurate results for base 10. /// and `log10()` can produce more accurate results for base 10.
pub fn log(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn log(args: Args) -> Result<MemoryItem, KclError> {
let nums = args.get_number_array()?; let nums = args.get_number_array()?;
if nums.len() > 2 { if nums.len() > 2 {
return Err(KclError::Type(KclErrorDetails { return Err(KclError::Type(KclErrorDetails {
@ -299,7 +299,7 @@ fn inner_log(num: f64, base: f64) -> Result<f64, KclError> {
} }
/// Computes the base 2 logarithm of the number. /// Computes the base 2 logarithm of the number.
pub fn log2(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn log2(args: Args) -> Result<MemoryItem, KclError> {
let num = args.get_number()?; let num = args.get_number()?;
let result = inner_log2(num)?; let result = inner_log2(num)?;
@ -315,7 +315,7 @@ fn inner_log2(num: f64) -> Result<f64, KclError> {
} }
/// Computes the base 10 logarithm of the number. /// Computes the base 10 logarithm of the number.
pub fn log10(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn log10(args: Args) -> Result<MemoryItem, KclError> {
let num = args.get_number()?; let num = args.get_number()?;
let result = inner_log10(num)?; let result = inner_log10(num)?;
@ -331,7 +331,7 @@ fn inner_log10(num: f64) -> Result<f64, KclError> {
} }
/// Computes the natural logarithm of the number. /// Computes the natural logarithm of the number.
pub fn ln(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn ln(args: Args) -> Result<MemoryItem, KclError> {
let num = args.get_number()?; let num = args.get_number()?;
let result = inner_ln(num)?; let result = inner_ln(num)?;
@ -347,7 +347,7 @@ fn inner_ln(num: f64) -> Result<f64, KclError> {
} }
/// Return the value of Euler's number `e`. /// Return the value of Euler's number `e`.
pub fn e(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn e(args: Args) -> Result<MemoryItem, KclError> {
let result = inner_e()?; let result = inner_e()?;
args.make_user_val_from_f64(result) args.make_user_val_from_f64(result)
@ -362,7 +362,7 @@ fn inner_e() -> Result<f64, KclError> {
} }
/// Return the value of `tau`. The full circle constant (τ). Equal to 2π. /// Return the value of `tau`. The full circle constant (τ). Equal to 2π.
pub fn tau(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn tau(args: Args) -> Result<MemoryItem, KclError> {
let result = inner_tau()?; let result = inner_tau()?;
args.make_user_val_from_f64(result) args.make_user_val_from_f64(result)
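
A note on the math built-ins above: the diff only touches the outer layer of each function. The public entry points become `async` and take `Args` by value so they fit the new `StdFn` signature, while the `inner_*` helpers stay synchronous because they never talk to the engine. A condensed sketch of that shape, using `sqrt` as the example (the `inner_sqrt` body is not shown in these hunks, so `num.sqrt()` is the presumed implementation, and the `#[stdlib]` attribute used on other helpers is elided):

// Outer layer: async only to match the shared StdFn signature; no .await needed.
pub async fn sqrt(args: Args) -> Result<MemoryItem, KclError> {
    let num = args.get_number()?;           // single numeric argument
    let result = inner_sqrt(num)?;          // pure computation, no engine call
    args.make_user_val_from_f64(result)     // wrap the f64 back into a MemoryItem
}

// Inner layer: stays a plain synchronous function.
fn inner_sqrt(num: f64) -> Result<f64, KclError> {
    Ok(num.sqrt())
}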

View File

@ -10,6 +10,7 @@ use std::collections::HashMap;
use anyhow::Result; use anyhow::Result;
use derive_docs::stdlib; use derive_docs::stdlib;
use kittycad::types::OkWebSocketResponseData;
use parse_display::{Display, FromStr}; use parse_display::{Display, FromStr};
use schemars::JsonSchema; use schemars::JsonSchema;
use serde::{Deserialize, Serialize}; use serde::{Deserialize, Serialize};
@ -21,7 +22,7 @@ use crate::{
executor::{ExtrudeGroup, MemoryItem, Metadata, SketchGroup, SourceRange}, executor::{ExtrudeGroup, MemoryItem, Metadata, SketchGroup, SourceRange},
}; };
pub type StdFn = fn(&mut Args) -> Result<MemoryItem, KclError>; pub type StdFn = fn(Args) -> std::pin::Pin<Box<dyn std::future::Future<Output = Result<MemoryItem, KclError>>>>;
pub type FnMap = HashMap<String, StdFn>; pub type FnMap = HashMap<String, StdFn>;
pub struct StdLib { pub struct StdLib {
@ -102,15 +103,15 @@ impl Default for StdLib {
} }
} }
#[derive(Debug)] #[derive(Debug, Clone)]
pub struct Args<'a> { pub struct Args {
pub args: Vec<MemoryItem>, pub args: Vec<MemoryItem>,
pub source_range: SourceRange, pub source_range: SourceRange,
engine: &'a mut EngineConnection, engine: EngineConnection,
} }
impl<'a> Args<'a> { impl Args {
pub fn new(args: Vec<MemoryItem>, source_range: SourceRange, engine: &'a mut EngineConnection) -> Self { pub fn new(args: Vec<MemoryItem>, source_range: SourceRange, engine: EngineConnection) -> Self {
Self { Self {
args, args,
source_range, source_range,
@ -118,8 +119,12 @@ impl<'a> Args<'a> {
} }
} }
pub fn send_modeling_cmd(&mut self, id: uuid::Uuid, cmd: kittycad::types::ModelingCmd) -> Result<(), KclError> { pub async fn send_modeling_cmd(
self.engine.send_modeling_cmd(id, self.source_range, cmd) &self,
id: uuid::Uuid,
cmd: kittycad::types::ModelingCmd,
) -> Result<OkWebSocketResponseData, KclError> {
self.engine.send_modeling_cmd(id, self.source_range, cmd).await
} }
fn make_user_val_from_json(&self, j: serde_json::Value) -> Result<MemoryItem, KclError> { fn make_user_val_from_json(&self, j: serde_json::Value) -> Result<MemoryItem, KclError> {
@ -443,7 +448,7 @@ impl<'a> Args<'a> {
/// Render a model. /// Render a model.
// This never actually gets called so this is fine. // This never actually gets called so this is fine.
pub fn show(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn show<'a>(args: Args) -> Result<MemoryItem, KclError> {
let sketch_group = args.get_sketch_group()?; let sketch_group = args.get_sketch_group()?;
inner_show(sketch_group); inner_show(sketch_group);
@ -457,7 +462,7 @@ pub fn show(args: &mut Args) -> Result<MemoryItem, KclError> {
fn inner_show(_sketch: Box<SketchGroup>) {} fn inner_show(_sketch: Box<SketchGroup>) {}
/// Returns the length of the given leg. /// Returns the length of the given leg.
pub fn leg_length(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn leg_length(args: Args) -> Result<MemoryItem, KclError> {
let (hypotenuse, leg) = args.get_hypotenuse_leg()?; let (hypotenuse, leg) = args.get_hypotenuse_leg()?;
let result = inner_leg_length(hypotenuse, leg); let result = inner_leg_length(hypotenuse, leg);
args.make_user_val_from_f64(result) args.make_user_val_from_f64(result)
@ -472,7 +477,7 @@ fn inner_leg_length(hypotenuse: f64, leg: f64) -> f64 {
} }
/// Returns the angle of the given leg for x. /// Returns the angle of the given leg for x.
pub fn leg_angle_x(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn leg_angle_x(args: Args) -> Result<MemoryItem, KclError> {
let (hypotenuse, leg) = args.get_hypotenuse_leg()?; let (hypotenuse, leg) = args.get_hypotenuse_leg()?;
let result = inner_leg_angle_x(hypotenuse, leg); let result = inner_leg_angle_x(hypotenuse, leg);
args.make_user_val_from_f64(result) args.make_user_val_from_f64(result)
@ -487,7 +492,7 @@ fn inner_leg_angle_x(hypotenuse: f64, leg: f64) -> f64 {
} }
/// Returns the angle of the given leg for y. /// Returns the angle of the given leg for y.
pub fn leg_angle_y(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn leg_angle_y(args: Args) -> Result<MemoryItem, KclError> {
let (hypotenuse, leg) = args.get_hypotenuse_leg()?; let (hypotenuse, leg) = args.get_hypotenuse_leg()?;
let result = inner_leg_angle_y(hypotenuse, leg); let result = inner_leg_angle_y(hypotenuse, leg);
args.make_user_val_from_f64(result) args.make_user_val_from_f64(result)
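
The new `StdFn` alias above is why this wrapping matters: a plain `async fn` cannot be stored in the `FnMap` directly, because every `async fn` has its own anonymous future type. Each entry point therefore has to be adapted into a `fn(Args) -> Pin<Box<dyn Future<...>>>`. One way such an adapter could look, sketched here with `show` (the adapter name `boxed_show` and the registration line are illustrative; the repository may generate this with a macro instead):

use std::{future::Future, pin::Pin};

// Adapt the async entry point to the StdFn signature by boxing its future.
fn boxed_show(args: Args) -> Pin<Box<dyn Future<Output = Result<MemoryItem, KclError>>>> {
    Box::pin(show(args))
}

// Hypothetical registration into the FnMap:
// fns.insert("show".to_string(), boxed_show as StdFn);

Since `Args` now derives `Clone` and owns its `EngineConnection` by value, the connection type presumably wraps shared handles internally; the hunks here do not show that detail.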

View File

@ -11,9 +11,9 @@ use crate::{
}; };
/// Returns the segment end of x. /// Returns the segment end of x.
pub fn segment_end_x(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn segment_end_x(args: Args) -> Result<MemoryItem, KclError> {
let (segment_name, sketch_group) = args.get_segment_name_sketch_group()?; let (segment_name, sketch_group) = args.get_segment_name_sketch_group()?;
let result = inner_segment_end_x(&segment_name, sketch_group, args)?; let result = inner_segment_end_x(&segment_name, sketch_group, args.clone())?;
args.make_user_val_from_f64(result) args.make_user_val_from_f64(result)
} }
@ -22,7 +22,7 @@ pub fn segment_end_x(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib { #[stdlib {
name = "segEndX", name = "segEndX",
}] }]
fn inner_segment_end_x(segment_name: &str, sketch_group: Box<SketchGroup>, args: &mut Args) -> Result<f64, KclError> { fn inner_segment_end_x(segment_name: &str, sketch_group: Box<SketchGroup>, args: Args) -> Result<f64, KclError> {
let line = sketch_group.get_base_by_name_or_start(segment_name).ok_or_else(|| { let line = sketch_group.get_base_by_name_or_start(segment_name).ok_or_else(|| {
KclError::Type(KclErrorDetails { KclError::Type(KclErrorDetails {
message: format!( message: format!(
@ -37,9 +37,9 @@ fn inner_segment_end_x(segment_name: &str, sketch_group: Box<SketchGroup>, args:
} }
/// Returns the segment end of y. /// Returns the segment end of y.
pub fn segment_end_y(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn segment_end_y(args: Args) -> Result<MemoryItem, KclError> {
let (segment_name, sketch_group) = args.get_segment_name_sketch_group()?; let (segment_name, sketch_group) = args.get_segment_name_sketch_group()?;
let result = inner_segment_end_y(&segment_name, sketch_group, args)?; let result = inner_segment_end_y(&segment_name, sketch_group, args.clone())?;
args.make_user_val_from_f64(result) args.make_user_val_from_f64(result)
} }
@ -48,7 +48,7 @@ pub fn segment_end_y(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib { #[stdlib {
name = "segEndY", name = "segEndY",
}] }]
fn inner_segment_end_y(segment_name: &str, sketch_group: Box<SketchGroup>, args: &mut Args) -> Result<f64, KclError> { fn inner_segment_end_y(segment_name: &str, sketch_group: Box<SketchGroup>, args: Args) -> Result<f64, KclError> {
let line = sketch_group.get_base_by_name_or_start(segment_name).ok_or_else(|| { let line = sketch_group.get_base_by_name_or_start(segment_name).ok_or_else(|| {
KclError::Type(KclErrorDetails { KclError::Type(KclErrorDetails {
message: format!( message: format!(
@ -63,9 +63,9 @@ fn inner_segment_end_y(segment_name: &str, sketch_group: Box<SketchGroup>, args:
} }
/// Returns the last segment of x. /// Returns the last segment of x.
pub fn last_segment_x(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn last_segment_x(args: Args) -> Result<MemoryItem, KclError> {
let sketch_group = args.get_sketch_group()?; let sketch_group = args.get_sketch_group()?;
let result = inner_last_segment_x(sketch_group, args)?; let result = inner_last_segment_x(sketch_group, args.clone())?;
args.make_user_val_from_f64(result) args.make_user_val_from_f64(result)
} }
@ -74,7 +74,7 @@ pub fn last_segment_x(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib { #[stdlib {
name = "lastSegX", name = "lastSegX",
}] }]
fn inner_last_segment_x(sketch_group: Box<SketchGroup>, args: &mut Args) -> Result<f64, KclError> { fn inner_last_segment_x(sketch_group: Box<SketchGroup>, args: Args) -> Result<f64, KclError> {
let last_line = sketch_group let last_line = sketch_group
.value .value
.last() .last()
@ -93,9 +93,9 @@ fn inner_last_segment_x(sketch_group: Box<SketchGroup>, args: &mut Args) -> Resu
} }
/// Returns the last segment of y. /// Returns the last segment of y.
pub fn last_segment_y(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn last_segment_y(args: Args) -> Result<MemoryItem, KclError> {
let sketch_group = args.get_sketch_group()?; let sketch_group = args.get_sketch_group()?;
let result = inner_last_segment_y(sketch_group, args)?; let result = inner_last_segment_y(sketch_group, args.clone())?;
args.make_user_val_from_f64(result) args.make_user_val_from_f64(result)
} }
@ -104,7 +104,7 @@ pub fn last_segment_y(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib { #[stdlib {
name = "lastSegY", name = "lastSegY",
}] }]
fn inner_last_segment_y(sketch_group: Box<SketchGroup>, args: &mut Args) -> Result<f64, KclError> { fn inner_last_segment_y(sketch_group: Box<SketchGroup>, args: Args) -> Result<f64, KclError> {
let last_line = sketch_group let last_line = sketch_group
.value .value
.last() .last()
@ -123,9 +123,9 @@ fn inner_last_segment_y(sketch_group: Box<SketchGroup>, args: &mut Args) -> Resu
} }
/// Returns the length of the segment. /// Returns the length of the segment.
pub fn segment_length(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn segment_length(args: Args) -> Result<MemoryItem, KclError> {
let (segment_name, sketch_group) = args.get_segment_name_sketch_group()?; let (segment_name, sketch_group) = args.get_segment_name_sketch_group()?;
let result = inner_segment_length(&segment_name, sketch_group, args)?; let result = inner_segment_length(&segment_name, sketch_group, args.clone())?;
args.make_user_val_from_f64(result) args.make_user_val_from_f64(result)
} }
@ -133,7 +133,7 @@ pub fn segment_length(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib { #[stdlib {
name = "segLen", name = "segLen",
}] }]
fn inner_segment_length(segment_name: &str, sketch_group: Box<SketchGroup>, args: &mut Args) -> Result<f64, KclError> { fn inner_segment_length(segment_name: &str, sketch_group: Box<SketchGroup>, args: Args) -> Result<f64, KclError> {
let path = sketch_group.get_path_by_name(segment_name).ok_or_else(|| { let path = sketch_group.get_path_by_name(segment_name).ok_or_else(|| {
KclError::Type(KclErrorDetails { KclError::Type(KclErrorDetails {
message: format!( message: format!(
@ -151,10 +151,10 @@ fn inner_segment_length(segment_name: &str, sketch_group: Box<SketchGroup>, args
} }
/// Returns the angle of the segment. /// Returns the angle of the segment.
pub fn segment_angle(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn segment_angle(args: Args) -> Result<MemoryItem, KclError> {
let (segment_name, sketch_group) = args.get_segment_name_sketch_group()?; let (segment_name, sketch_group) = args.get_segment_name_sketch_group()?;
let result = inner_segment_angle(&segment_name, sketch_group, args)?; let result = inner_segment_angle(&segment_name, sketch_group, args.clone())?;
args.make_user_val_from_f64(result) args.make_user_val_from_f64(result)
} }
@ -162,7 +162,7 @@ pub fn segment_angle(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib { #[stdlib {
name = "segAng", name = "segAng",
}] }]
fn inner_segment_angle(segment_name: &str, sketch_group: Box<SketchGroup>, args: &mut Args) -> Result<f64, KclError> { fn inner_segment_angle(segment_name: &str, sketch_group: Box<SketchGroup>, args: Args) -> Result<f64, KclError> {
let path = sketch_group.get_path_by_name(segment_name).ok_or_else(|| { let path = sketch_group.get_path_by_name(segment_name).ok_or_else(|| {
KclError::Type(KclErrorDetails { KclError::Type(KclErrorDetails {
message: format!( message: format!(
@ -180,9 +180,9 @@ fn inner_segment_angle(segment_name: &str, sketch_group: Box<SketchGroup>, args:
} }
/// Returns the angle to match the given length for x. /// Returns the angle to match the given length for x.
pub fn angle_to_match_length_x(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn angle_to_match_length_x(args: Args) -> Result<MemoryItem, KclError> {
let (segment_name, to, sketch_group) = args.get_segment_name_to_number_sketch_group()?; let (segment_name, to, sketch_group) = args.get_segment_name_to_number_sketch_group()?;
let result = inner_angle_to_match_length_x(&segment_name, to, sketch_group, args)?; let result = inner_angle_to_match_length_x(&segment_name, to, sketch_group, args.clone())?;
args.make_user_val_from_f64(result) args.make_user_val_from_f64(result)
} }
@ -194,7 +194,7 @@ fn inner_angle_to_match_length_x(
segment_name: &str, segment_name: &str,
to: f64, to: f64,
sketch_group: Box<SketchGroup>, sketch_group: Box<SketchGroup>,
args: &mut Args, args: Args,
) -> Result<f64, KclError> { ) -> Result<f64, KclError> {
let path = sketch_group.get_path_by_name(segment_name).ok_or_else(|| { let path = sketch_group.get_path_by_name(segment_name).ok_or_else(|| {
KclError::Type(KclErrorDetails { KclError::Type(KclErrorDetails {
@ -235,9 +235,9 @@ fn inner_angle_to_match_length_x(
} }
/// Returns the angle to match the given length for y. /// Returns the angle to match the given length for y.
pub fn angle_to_match_length_y(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn angle_to_match_length_y(args: Args) -> Result<MemoryItem, KclError> {
let (segment_name, to, sketch_group) = args.get_segment_name_to_number_sketch_group()?; let (segment_name, to, sketch_group) = args.get_segment_name_to_number_sketch_group()?;
let result = inner_angle_to_match_length_y(&segment_name, to, sketch_group, args)?; let result = inner_angle_to_match_length_y(&segment_name, to, sketch_group, args.clone())?;
args.make_user_val_from_f64(result) args.make_user_val_from_f64(result)
} }
@ -249,7 +249,7 @@ fn inner_angle_to_match_length_y(
segment_name: &str, segment_name: &str,
to: f64, to: f64,
sketch_group: Box<SketchGroup>, sketch_group: Box<SketchGroup>,
args: &mut Args, args: Args,
) -> Result<f64, KclError> { ) -> Result<f64, KclError> {
let path = sketch_group.get_path_by_name(segment_name).ok_or_else(|| { let path = sketch_group.get_path_by_name(segment_name).ok_or_else(|| {
KclError::Type(KclErrorDetails { KclError::Type(KclErrorDetails {
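
One detail worth noting across this file: the entry points now pass `args.clone()` into their `inner_*` helpers instead of reborrowing a `&mut Args`. They have to, because each helper consumes `Args` by value while the caller still needs `args` afterwards to build the return value, and `Args` only became clonable as part of this change. Condensed from the `segLen` hunk above (same code, just isolated):

pub async fn segment_length(args: Args) -> Result<MemoryItem, KclError> {
    let (segment_name, sketch_group) = args.get_segment_name_sketch_group()?;
    // The helper takes Args by value, so hand it a clone; the original is
    // still needed on the next line.
    let result = inner_segment_length(&segment_name, sketch_group, args.clone())?;
    args.make_user_val_from_f64(result)
}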

View File

@ -33,10 +33,10 @@ pub enum LineToData {
} }
/// Draw a line to a point. /// Draw a line to a point.
pub fn line_to(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn line_to(args: Args) -> Result<MemoryItem, KclError> {
let (data, sketch_group): (LineToData, Box<SketchGroup>) = args.get_data_and_sketch_group()?; let (data, sketch_group): (LineToData, Box<SketchGroup>) = args.get_data_and_sketch_group()?;
let new_sketch_group = inner_line_to(data, sketch_group, args)?; let new_sketch_group = inner_line_to(data, sketch_group, args).await?;
Ok(MemoryItem::SketchGroup(new_sketch_group)) Ok(MemoryItem::SketchGroup(new_sketch_group))
} }
@ -44,10 +44,10 @@ pub fn line_to(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib { #[stdlib {
name = "lineTo", name = "lineTo",
}] }]
fn inner_line_to( async fn inner_line_to(
data: LineToData, data: LineToData,
sketch_group: Box<SketchGroup>, sketch_group: Box<SketchGroup>,
args: &mut Args, args: Args,
) -> Result<Box<SketchGroup>, KclError> { ) -> Result<Box<SketchGroup>, KclError> {
let from = sketch_group.get_coords_from_paths()?; let from = sketch_group.get_coords_from_paths()?;
let to = match data { let to = match data {
@ -67,9 +67,11 @@ fn inner_line_to(
y: to[1], y: to[1],
z: 0.0, z: 0.0,
}, },
relative: false,
}, },
}, },
)?; )
.await?;
let current_path = Path::ToPoint { let current_path = Path::ToPoint {
base: BasePath { base: BasePath {
@ -110,10 +112,10 @@ pub enum AxisLineToData {
} }
/// Draw a line to a point on the x-axis. /// Draw a line to a point on the x-axis.
pub fn x_line_to(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn x_line_to(args: Args) -> Result<MemoryItem, KclError> {
let (data, sketch_group): (AxisLineToData, Box<SketchGroup>) = args.get_data_and_sketch_group()?; let (data, sketch_group): (AxisLineToData, Box<SketchGroup>) = args.get_data_and_sketch_group()?;
let new_sketch_group = inner_x_line_to(data, sketch_group, args)?; let new_sketch_group = inner_x_line_to(data, sketch_group, args).await?;
Ok(MemoryItem::SketchGroup(new_sketch_group)) Ok(MemoryItem::SketchGroup(new_sketch_group))
} }
@ -121,10 +123,10 @@ pub fn x_line_to(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib { #[stdlib {
name = "xLineTo", name = "xLineTo",
}] }]
fn inner_x_line_to( async fn inner_x_line_to(
data: AxisLineToData, data: AxisLineToData,
sketch_group: Box<SketchGroup>, sketch_group: Box<SketchGroup>,
args: &mut Args, args: Args,
) -> Result<Box<SketchGroup>, KclError> { ) -> Result<Box<SketchGroup>, KclError> {
let from = sketch_group.get_coords_from_paths()?; let from = sketch_group.get_coords_from_paths()?;
@ -133,16 +135,16 @@ fn inner_x_line_to(
AxisLineToData::Point(data) => LineToData::Point([data, from.y]), AxisLineToData::Point(data) => LineToData::Point([data, from.y]),
}; };
let new_sketch_group = inner_line_to(line_to_data, sketch_group, args)?; let new_sketch_group = inner_line_to(line_to_data, sketch_group, args).await?;
Ok(new_sketch_group) Ok(new_sketch_group)
} }
/// Draw a line to a point on the y-axis. /// Draw a line to a point on the y-axis.
pub fn y_line_to(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn y_line_to(args: Args) -> Result<MemoryItem, KclError> {
let (data, sketch_group): (AxisLineToData, Box<SketchGroup>) = args.get_data_and_sketch_group()?; let (data, sketch_group): (AxisLineToData, Box<SketchGroup>) = args.get_data_and_sketch_group()?;
let new_sketch_group = inner_y_line_to(data, sketch_group, args)?; let new_sketch_group = inner_y_line_to(data, sketch_group, args).await?;
Ok(MemoryItem::SketchGroup(new_sketch_group)) Ok(MemoryItem::SketchGroup(new_sketch_group))
} }
@ -150,10 +152,10 @@ pub fn y_line_to(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib { #[stdlib {
name = "yLineTo", name = "yLineTo",
}] }]
fn inner_y_line_to( async fn inner_y_line_to(
data: AxisLineToData, data: AxisLineToData,
sketch_group: Box<SketchGroup>, sketch_group: Box<SketchGroup>,
args: &mut Args, args: Args,
) -> Result<Box<SketchGroup>, KclError> { ) -> Result<Box<SketchGroup>, KclError> {
let from = sketch_group.get_coords_from_paths()?; let from = sketch_group.get_coords_from_paths()?;
@ -162,7 +164,7 @@ fn inner_y_line_to(
AxisLineToData::Point(data) => LineToData::Point([from.x, data]), AxisLineToData::Point(data) => LineToData::Point([from.x, data]),
}; };
let new_sketch_group = inner_line_to(line_to_data, sketch_group, args)?; let new_sketch_group = inner_line_to(line_to_data, sketch_group, args).await?;
Ok(new_sketch_group) Ok(new_sketch_group)
} }
@ -183,10 +185,10 @@ pub enum LineData {
} }
/// Draw a line. /// Draw a line.
pub fn line(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn line(args: Args) -> Result<MemoryItem, KclError> {
let (data, sketch_group): (LineData, Box<SketchGroup>) = args.get_data_and_sketch_group()?; let (data, sketch_group): (LineData, Box<SketchGroup>) = args.get_data_and_sketch_group()?;
let new_sketch_group = inner_line(data, sketch_group, args)?; let new_sketch_group = inner_line(data, sketch_group, args).await?;
Ok(MemoryItem::SketchGroup(new_sketch_group)) Ok(MemoryItem::SketchGroup(new_sketch_group))
} }
@ -194,13 +196,14 @@ pub fn line(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib { #[stdlib {
name = "line", name = "line",
}] }]
fn inner_line(data: LineData, sketch_group: Box<SketchGroup>, args: &mut Args) -> Result<Box<SketchGroup>, KclError> { async fn inner_line(data: LineData, sketch_group: Box<SketchGroup>, args: Args) -> Result<Box<SketchGroup>, KclError> {
let from = sketch_group.get_coords_from_paths()?; let from = sketch_group.get_coords_from_paths()?;
let inner_args = match &data { let inner_args = match &data {
LineData::PointWithTag { to, .. } => *to, LineData::PointWithTag { to, .. } => *to,
LineData::Point(to) => *to, LineData::Point(to) => *to,
}; };
let delta = inner_args;
let to = [from.x + inner_args[0], from.y + inner_args[1]]; let to = [from.x + inner_args[0], from.y + inner_args[1]];
let id = uuid::Uuid::new_v4(); let id = uuid::Uuid::new_v4();
@ -211,13 +214,15 @@ fn inner_line(data: LineData, sketch_group: Box<SketchGroup>, args: &mut Args) -
path: sketch_group.id, path: sketch_group.id,
segment: kittycad::types::PathSegment::Line { segment: kittycad::types::PathSegment::Line {
end: Point3D { end: Point3D {
x: to[0], x: delta[0],
y: to[1], y: delta[1],
z: 0.0, z: 0.0,
}, },
relative: true,
}, },
}, },
)?; )
.await?;
let current_path = Path::ToPoint { let current_path = Path::ToPoint {
base: BasePath { base: BasePath {
@ -258,10 +263,10 @@ pub enum AxisLineData {
} }
/// Draw a line on the x-axis. /// Draw a line on the x-axis.
pub fn x_line(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn x_line(args: Args) -> Result<MemoryItem, KclError> {
let (data, sketch_group): (AxisLineData, Box<SketchGroup>) = args.get_data_and_sketch_group()?; let (data, sketch_group): (AxisLineData, Box<SketchGroup>) = args.get_data_and_sketch_group()?;
let new_sketch_group = inner_x_line(data, sketch_group, args)?; let new_sketch_group = inner_x_line(data, sketch_group, args).await?;
Ok(MemoryItem::SketchGroup(new_sketch_group)) Ok(MemoryItem::SketchGroup(new_sketch_group))
} }
@ -269,25 +274,25 @@ pub fn x_line(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib { #[stdlib {
name = "xLine", name = "xLine",
}] }]
fn inner_x_line( async fn inner_x_line(
data: AxisLineData, data: AxisLineData,
sketch_group: Box<SketchGroup>, sketch_group: Box<SketchGroup>,
args: &mut Args, args: Args,
) -> Result<Box<SketchGroup>, KclError> { ) -> Result<Box<SketchGroup>, KclError> {
let line_data = match data { let line_data = match data {
AxisLineData::LengthWithTag { length, tag } => LineData::PointWithTag { to: [length, 0.0], tag }, AxisLineData::LengthWithTag { length, tag } => LineData::PointWithTag { to: [length, 0.0], tag },
AxisLineData::Length(length) => LineData::Point([length, 0.0]), AxisLineData::Length(length) => LineData::Point([length, 0.0]),
}; };
let new_sketch_group = inner_line(line_data, sketch_group, args)?; let new_sketch_group = inner_line(line_data, sketch_group, args).await?;
Ok(new_sketch_group) Ok(new_sketch_group)
} }
/// Draw a line on the y-axis. /// Draw a line on the y-axis.
pub fn y_line(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn y_line(args: Args) -> Result<MemoryItem, KclError> {
let (data, sketch_group): (AxisLineData, Box<SketchGroup>) = args.get_data_and_sketch_group()?; let (data, sketch_group): (AxisLineData, Box<SketchGroup>) = args.get_data_and_sketch_group()?;
let new_sketch_group = inner_y_line(data, sketch_group, args)?; let new_sketch_group = inner_y_line(data, sketch_group, args).await?;
Ok(MemoryItem::SketchGroup(new_sketch_group)) Ok(MemoryItem::SketchGroup(new_sketch_group))
} }
@ -295,17 +300,17 @@ pub fn y_line(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib { #[stdlib {
name = "yLine", name = "yLine",
}] }]
fn inner_y_line( async fn inner_y_line(
data: AxisLineData, data: AxisLineData,
sketch_group: Box<SketchGroup>, sketch_group: Box<SketchGroup>,
args: &mut Args, args: Args,
) -> Result<Box<SketchGroup>, KclError> { ) -> Result<Box<SketchGroup>, KclError> {
let line_data = match data { let line_data = match data {
AxisLineData::LengthWithTag { length, tag } => LineData::PointWithTag { to: [0.0, length], tag }, AxisLineData::LengthWithTag { length, tag } => LineData::PointWithTag { to: [0.0, length], tag },
AxisLineData::Length(length) => LineData::Point([0.0, length]), AxisLineData::Length(length) => LineData::Point([0.0, length]),
}; };
let new_sketch_group = inner_line(line_data, sketch_group, args)?; let new_sketch_group = inner_line(line_data, sketch_group, args).await?;
Ok(new_sketch_group) Ok(new_sketch_group)
} }
@ -328,10 +333,10 @@ pub enum AngledLineData {
} }
/// Draw an angled line. /// Draw an angled line.
pub fn angled_line(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn angled_line(args: Args) -> Result<MemoryItem, KclError> {
let (data, sketch_group): (AngledLineData, Box<SketchGroup>) = args.get_data_and_sketch_group()?; let (data, sketch_group): (AngledLineData, Box<SketchGroup>) = args.get_data_and_sketch_group()?;
let new_sketch_group = inner_angled_line(data, sketch_group, args)?; let new_sketch_group = inner_angled_line(data, sketch_group, args).await?;
Ok(MemoryItem::SketchGroup(new_sketch_group)) Ok(MemoryItem::SketchGroup(new_sketch_group))
} }
@ -339,20 +344,25 @@ pub fn angled_line(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib { #[stdlib {
name = "angledLine", name = "angledLine",
}] }]
fn inner_angled_line( async fn inner_angled_line(
data: AngledLineData, data: AngledLineData,
sketch_group: Box<SketchGroup>, sketch_group: Box<SketchGroup>,
args: &mut Args, args: Args,
) -> Result<Box<SketchGroup>, KclError> { ) -> Result<Box<SketchGroup>, KclError> {
let from = sketch_group.get_coords_from_paths()?; let from = sketch_group.get_coords_from_paths()?;
let (angle, length) = match &data { let (angle, length) = match &data {
AngledLineData::AngleWithTag { angle, length, .. } => (*angle, *length), AngledLineData::AngleWithTag { angle, length, .. } => (*angle, *length),
AngledLineData::AngleAndLength(angle_and_length) => (angle_and_length[0], angle_and_length[1]), AngledLineData::AngleAndLength(angle_and_length) => (angle_and_length[0], angle_and_length[1]),
}; };
let to: [f64; 2] = [
from.x + length * f64::cos(angle.to_radians()), //double check me on this one - mike
from.y + length * f64::sin(angle.to_radians()), let delta: [f64; 2] = [
length * f64::cos(angle.to_radians()),
length * f64::sin(angle.to_radians()),
]; ];
let relative = true;
let to: [f64; 2] = [from.x + delta[0], from.y + delta[1]];
let id = uuid::Uuid::new_v4(); let id = uuid::Uuid::new_v4();
@ -378,13 +388,15 @@ fn inner_angled_line(
path: sketch_group.id, path: sketch_group.id,
segment: kittycad::types::PathSegment::Line { segment: kittycad::types::PathSegment::Line {
end: Point3D { end: Point3D {
x: to[0], x: delta[0],
y: to[1], y: delta[1],
z: 0.0, z: 0.0,
}, },
relative,
}, },
}, },
)?; )
.await?;
let mut new_sketch_group = sketch_group.clone(); let mut new_sketch_group = sketch_group.clone();
new_sketch_group.value.push(current_path); new_sketch_group.value.push(current_path);
@ -392,10 +404,10 @@ fn inner_angled_line(
} }
/// Draw an angled line of a given x length. /// Draw an angled line of a given x length.
pub fn angled_line_of_x_length(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn angled_line_of_x_length(args: Args) -> Result<MemoryItem, KclError> {
let (data, sketch_group): (AngledLineData, Box<SketchGroup>) = args.get_data_and_sketch_group()?; let (data, sketch_group): (AngledLineData, Box<SketchGroup>) = args.get_data_and_sketch_group()?;
let new_sketch_group = inner_angled_line_of_x_length(data, sketch_group, args)?; let new_sketch_group = inner_angled_line_of_x_length(data, sketch_group, args).await?;
Ok(MemoryItem::SketchGroup(new_sketch_group)) Ok(MemoryItem::SketchGroup(new_sketch_group))
} }
@ -403,10 +415,10 @@ pub fn angled_line_of_x_length(args: &mut Args) -> Result<MemoryItem, KclError>
#[stdlib { #[stdlib {
name = "angledLineOfXLength", name = "angledLineOfXLength",
}] }]
fn inner_angled_line_of_x_length( async fn inner_angled_line_of_x_length(
data: AngledLineData, data: AngledLineData,
sketch_group: Box<SketchGroup>, sketch_group: Box<SketchGroup>,
args: &mut Args, args: Args,
) -> Result<Box<SketchGroup>, KclError> { ) -> Result<Box<SketchGroup>, KclError> {
let (angle, length) = match &data { let (angle, length) = match &data {
AngledLineData::AngleWithTag { angle, length, .. } => (*angle, *length), AngledLineData::AngleWithTag { angle, length, .. } => (*angle, *length),
@ -423,7 +435,8 @@ fn inner_angled_line_of_x_length(
}, },
sketch_group, sketch_group,
args, args,
)?; )
.await?;
Ok(new_sketch_group) Ok(new_sketch_group)
} }
@ -447,10 +460,10 @@ pub enum AngledLineToData {
} }
/// Draw an angled line to a given x coordinate. /// Draw an angled line to a given x coordinate.
pub fn angled_line_to_x(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn angled_line_to_x(args: Args) -> Result<MemoryItem, KclError> {
let (data, sketch_group): (AngledLineToData, Box<SketchGroup>) = args.get_data_and_sketch_group()?; let (data, sketch_group): (AngledLineToData, Box<SketchGroup>) = args.get_data_and_sketch_group()?;
let new_sketch_group = inner_angled_line_to_x(data, sketch_group, args)?; let new_sketch_group = inner_angled_line_to_x(data, sketch_group, args).await?;
Ok(MemoryItem::SketchGroup(new_sketch_group)) Ok(MemoryItem::SketchGroup(new_sketch_group))
} }
@ -458,10 +471,10 @@ pub fn angled_line_to_x(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib { #[stdlib {
name = "angledLineToX", name = "angledLineToX",
}] }]
fn inner_angled_line_to_x( async fn inner_angled_line_to_x(
data: AngledLineToData, data: AngledLineToData,
sketch_group: Box<SketchGroup>, sketch_group: Box<SketchGroup>,
args: &mut Args, args: Args,
) -> Result<Box<SketchGroup>, KclError> { ) -> Result<Box<SketchGroup>, KclError> {
let from = sketch_group.get_coords_from_paths()?; let from = sketch_group.get_coords_from_paths()?;
let (angle, x_to) = match &data { let (angle, x_to) = match &data {
@ -481,15 +494,16 @@ fn inner_angled_line_to_x(
}, },
sketch_group, sketch_group,
args, args,
)?; )
.await?;
Ok(new_sketch_group) Ok(new_sketch_group)
} }
/// Draw an angled line of a given y length. /// Draw an angled line of a given y length.
pub fn angled_line_of_y_length(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn angled_line_of_y_length(args: Args) -> Result<MemoryItem, KclError> {
let (data, sketch_group): (AngledLineData, Box<SketchGroup>) = args.get_data_and_sketch_group()?; let (data, sketch_group): (AngledLineData, Box<SketchGroup>) = args.get_data_and_sketch_group()?;
let new_sketch_group = inner_angled_line_of_y_length(data, sketch_group, args)?; let new_sketch_group = inner_angled_line_of_y_length(data, sketch_group, args).await?;
Ok(MemoryItem::SketchGroup(new_sketch_group)) Ok(MemoryItem::SketchGroup(new_sketch_group))
} }
@ -498,10 +512,10 @@ pub fn angled_line_of_y_length(args: &mut Args) -> Result<MemoryItem, KclError>
#[stdlib { #[stdlib {
name = "angledLineOfYLength", name = "angledLineOfYLength",
}] }]
fn inner_angled_line_of_y_length( async fn inner_angled_line_of_y_length(
data: AngledLineData, data: AngledLineData,
sketch_group: Box<SketchGroup>, sketch_group: Box<SketchGroup>,
args: &mut Args, args: Args,
) -> Result<Box<SketchGroup>, KclError> { ) -> Result<Box<SketchGroup>, KclError> {
let (angle, length) = match &data { let (angle, length) = match &data {
AngledLineData::AngleWithTag { angle, length, .. } => (*angle, *length), AngledLineData::AngleWithTag { angle, length, .. } => (*angle, *length),
@ -518,16 +532,17 @@ fn inner_angled_line_of_y_length(
}, },
sketch_group, sketch_group,
args, args,
)?; )
.await?;
Ok(new_sketch_group) Ok(new_sketch_group)
} }
/// Draw an angled line to a given y coordinate. /// Draw an angled line to a given y coordinate.
pub fn angled_line_to_y(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn angled_line_to_y(args: Args) -> Result<MemoryItem, KclError> {
let (data, sketch_group): (AngledLineToData, Box<SketchGroup>) = args.get_data_and_sketch_group()?; let (data, sketch_group): (AngledLineToData, Box<SketchGroup>) = args.get_data_and_sketch_group()?;
let new_sketch_group = inner_angled_line_to_y(data, sketch_group, args)?; let new_sketch_group = inner_angled_line_to_y(data, sketch_group, args).await?;
Ok(MemoryItem::SketchGroup(new_sketch_group)) Ok(MemoryItem::SketchGroup(new_sketch_group))
} }
@ -535,10 +550,10 @@ pub fn angled_line_to_y(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib { #[stdlib {
name = "angledLineToY", name = "angledLineToY",
}] }]
fn inner_angled_line_to_y( async fn inner_angled_line_to_y(
data: AngledLineToData, data: AngledLineToData,
sketch_group: Box<SketchGroup>, sketch_group: Box<SketchGroup>,
args: &mut Args, args: Args,
) -> Result<Box<SketchGroup>, KclError> { ) -> Result<Box<SketchGroup>, KclError> {
let from = sketch_group.get_coords_from_paths()?; let from = sketch_group.get_coords_from_paths()?;
let (angle, y_to) = match &data { let (angle, y_to) = match &data {
@ -558,7 +573,8 @@ fn inner_angled_line_to_y(
}, },
sketch_group, sketch_group,
args, args,
)?; )
.await?;
Ok(new_sketch_group) Ok(new_sketch_group)
} }
@ -579,9 +595,9 @@ pub struct AngeledLineThatIntersectsData {
} }
/// Draw an angled line that intersects with a given line. /// Draw an angled line that intersects with a given line.
pub fn angled_line_that_intersects(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn angled_line_that_intersects(args: Args) -> Result<MemoryItem, KclError> {
let (data, sketch_group): (AngeledLineThatIntersectsData, Box<SketchGroup>) = args.get_data_and_sketch_group()?; let (data, sketch_group): (AngeledLineThatIntersectsData, Box<SketchGroup>) = args.get_data_and_sketch_group()?;
let new_sketch_group = inner_angled_line_that_intersects(data, sketch_group, args)?; let new_sketch_group = inner_angled_line_that_intersects(data, sketch_group, args).await?;
Ok(MemoryItem::SketchGroup(new_sketch_group)) Ok(MemoryItem::SketchGroup(new_sketch_group))
} }
@ -589,10 +605,10 @@ pub fn angled_line_that_intersects(args: &mut Args) -> Result<MemoryItem, KclErr
#[stdlib { #[stdlib {
name = "angledLineThatIntersects", name = "angledLineThatIntersects",
}] }]
fn inner_angled_line_that_intersects( async fn inner_angled_line_that_intersects(
data: AngeledLineThatIntersectsData, data: AngeledLineThatIntersectsData,
sketch_group: Box<SketchGroup>, sketch_group: Box<SketchGroup>,
args: &mut Args, args: Args,
) -> Result<Box<SketchGroup>, KclError> { ) -> Result<Box<SketchGroup>, KclError> {
let intersect_path = sketch_group let intersect_path = sketch_group
.get_path_by_name(&data.intersect_tag) .get_path_by_name(&data.intersect_tag)
@ -621,15 +637,15 @@ fn inner_angled_line_that_intersects(
LineToData::Point(to.into()) LineToData::Point(to.into())
}; };
let new_sketch_group = inner_line_to(line_to_data, sketch_group, args)?; let new_sketch_group = inner_line_to(line_to_data, sketch_group, args).await?;
Ok(new_sketch_group) Ok(new_sketch_group)
} }
/// Start a sketch at a given point. /// Start a sketch at a given point.
pub fn start_sketch_at(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn start_sketch_at(args: Args) -> Result<MemoryItem, KclError> {
let data: LineData = args.get_data()?; let data: LineData = args.get_data()?;
let sketch_group = inner_start_sketch_at(data, args)?; let sketch_group = inner_start_sketch_at(data, args).await?;
Ok(MemoryItem::SketchGroup(sketch_group)) Ok(MemoryItem::SketchGroup(sketch_group))
} }
@ -637,7 +653,7 @@ pub fn start_sketch_at(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib { #[stdlib {
name = "startSketchAt", name = "startSketchAt",
}] }]
fn inner_start_sketch_at(data: LineData, args: &mut Args) -> Result<Box<SketchGroup>, KclError> { async fn inner_start_sketch_at(data: LineData, args: Args) -> Result<Box<SketchGroup>, KclError> {
let to = match &data { let to = match &data {
LineData::PointWithTag { to, .. } => *to, LineData::PointWithTag { to, .. } => *to,
LineData::Point(to) => *to, LineData::Point(to) => *to,
@ -646,7 +662,7 @@ fn inner_start_sketch_at(data: LineData, args: &mut Args) -> Result<Box<SketchGr
let id = uuid::Uuid::new_v4(); let id = uuid::Uuid::new_v4();
let path_id = uuid::Uuid::new_v4(); let path_id = uuid::Uuid::new_v4();
args.send_modeling_cmd(path_id, ModelingCmd::StartPath {})?; args.send_modeling_cmd(path_id, ModelingCmd::StartPath {}).await?;
args.send_modeling_cmd( args.send_modeling_cmd(
id, id,
ModelingCmd::MovePathPen { ModelingCmd::MovePathPen {
@ -657,7 +673,8 @@ fn inner_start_sketch_at(data: LineData, args: &mut Args) -> Result<Box<SketchGr
z: 0.0, z: 0.0,
}, },
}, },
)?; )
.await?;
let current_path = BasePath { let current_path = BasePath {
from: to, from: to,
@ -685,10 +702,10 @@ fn inner_start_sketch_at(data: LineData, args: &mut Args) -> Result<Box<SketchGr
} }
/// Close the current sketch. /// Close the current sketch.
pub fn close(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn close(args: Args) -> Result<MemoryItem, KclError> {
let sketch_group = args.get_sketch_group()?; let sketch_group = args.get_sketch_group()?;
let new_sketch_group = inner_close(sketch_group, args)?; let new_sketch_group = inner_close(sketch_group, args).await?;
Ok(MemoryItem::SketchGroup(new_sketch_group)) Ok(MemoryItem::SketchGroup(new_sketch_group))
} }
@ -697,7 +714,7 @@ pub fn close(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib { #[stdlib {
name = "close", name = "close",
}] }]
fn inner_close(sketch_group: Box<SketchGroup>, args: &mut Args) -> Result<Box<SketchGroup>, KclError> { async fn inner_close(sketch_group: Box<SketchGroup>, args: Args) -> Result<Box<SketchGroup>, KclError> {
let from = sketch_group.get_coords_from_paths()?; let from = sketch_group.get_coords_from_paths()?;
let to: Point2d = sketch_group.start.from.into(); let to: Point2d = sketch_group.start.from.into();
@ -708,7 +725,8 @@ fn inner_close(sketch_group: Box<SketchGroup>, args: &mut Args) -> Result<Box<Sk
ModelingCmd::ClosePath { ModelingCmd::ClosePath {
path_id: sketch_group.id, path_id: sketch_group.id,
}, },
)?; )
.await?;
let mut new_sketch_group = sketch_group.clone(); let mut new_sketch_group = sketch_group.clone();
new_sketch_group.value.push(Path::ToPoint { new_sketch_group.value.push(Path::ToPoint {
@ -775,10 +793,10 @@ pub enum ArcData {
} }
/// Draw an arc. /// Draw an arc.
pub fn arc(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn arc(args: Args) -> Result<MemoryItem, KclError> {
let (data, sketch_group): (ArcData, Box<SketchGroup>) = args.get_data_and_sketch_group()?; let (data, sketch_group): (ArcData, Box<SketchGroup>) = args.get_data_and_sketch_group()?;
let new_sketch_group = inner_arc(data, sketch_group, args)?; let new_sketch_group = inner_arc(data, sketch_group, args).await?;
Ok(MemoryItem::SketchGroup(new_sketch_group)) Ok(MemoryItem::SketchGroup(new_sketch_group))
} }
@ -786,7 +804,7 @@ pub fn arc(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib { #[stdlib {
name = "arc", name = "arc",
}] }]
fn inner_arc(data: ArcData, sketch_group: Box<SketchGroup>, args: &mut Args) -> Result<Box<SketchGroup>, KclError> { async fn inner_arc(data: ArcData, sketch_group: Box<SketchGroup>, args: Args) -> Result<Box<SketchGroup>, KclError> {
let from: Point2d = sketch_group.get_coords_from_paths()?; let from: Point2d = sketch_group.get_coords_from_paths()?;
let (center, angle_start, angle_end, radius, end) = match &data { let (center, angle_start, angle_end, radius, end) = match &data {
@ -832,24 +850,11 @@ fn inner_arc(data: ArcData, sketch_group: Box<SketchGroup>, args: &mut Args) ->
angle_end: angle_end.degrees(), angle_end: angle_end.degrees(),
center: center.into(), center: center.into(),
radius, radius,
relative: false,
}, },
}, },
)?; )
// Move the path pen to the end of the arc. .await?;
// Since that is where we want to draw the next path.
// TODO: the engine should automatically move the pen to the end of the arc.
// This just seems inefficient.
args.send_modeling_cmd(
id,
ModelingCmd::MovePathPen {
path: sketch_group.id,
to: Point3D {
x: end.x,
y: end.y,
z: 0.0,
},
},
)?;
let current_path = Path::ToPoint { let current_path = Path::ToPoint {
base: BasePath { base: BasePath {
@ -902,10 +907,10 @@ pub enum BezierData {
} }
/// Draw a bezier curve. /// Draw a bezier curve.
pub fn bezier_curve(args: &mut Args) -> Result<MemoryItem, KclError> { pub async fn bezier_curve(args: Args) -> Result<MemoryItem, KclError> {
let (data, sketch_group): (BezierData, Box<SketchGroup>) = args.get_data_and_sketch_group()?; let (data, sketch_group): (BezierData, Box<SketchGroup>) = args.get_data_and_sketch_group()?;
let new_sketch_group = inner_bezier_curve(data, sketch_group, args)?; let new_sketch_group = inner_bezier_curve(data, sketch_group, args).await?;
Ok(MemoryItem::SketchGroup(new_sketch_group)) Ok(MemoryItem::SketchGroup(new_sketch_group))
} }
@ -913,10 +918,10 @@ pub fn bezier_curve(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib { #[stdlib {
name = "bezierCurve", name = "bezierCurve",
}] }]
fn inner_bezier_curve( async fn inner_bezier_curve(
data: BezierData, data: BezierData,
sketch_group: Box<SketchGroup>, sketch_group: Box<SketchGroup>,
args: &mut Args, args: Args,
) -> Result<Box<SketchGroup>, KclError> { ) -> Result<Box<SketchGroup>, KclError> {
let from = sketch_group.get_coords_from_paths()?; let from = sketch_group.get_coords_from_paths()?;
@ -927,6 +932,8 @@ fn inner_bezier_curve(
BezierData::Points { to, control1, control2 } => (to, control1, control2), BezierData::Points { to, control1, control2 } => (to, control1, control2),
}; };
let relative = true;
let delta = to;
let to = [from.x + to[0], from.y + to[1]]; let to = [from.x + to[0], from.y + to[1]];
let id = uuid::Uuid::new_v4(); let id = uuid::Uuid::new_v4();
@ -937,23 +944,25 @@ fn inner_bezier_curve(
path: sketch_group.id, path: sketch_group.id,
segment: kittycad::types::PathSegment::Bezier { segment: kittycad::types::PathSegment::Bezier {
control1: Point3D { control1: Point3D {
x: from.x + control1[0], x: control1[0],
y: from.y + control1[1], y: control1[1],
z: 0.0, z: 0.0,
}, },
control2: Point3D { control2: Point3D {
x: from.x + control2[0], x: control2[0],
y: from.y + control2[1], y: control2[1],
z: 0.0, z: 0.0,
}, },
end: Point3D { end: Point3D {
x: to[0], x: delta[0],
y: to[1], y: delta[1],
z: 0.0, z: 0.0,
}, },
relative,
}, },
}, },
)?; )
.await?;
let current_path = Path::ToPoint { let current_path = Path::ToPoint {
base: BasePath { base: BasePath {
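
The recurring change in this file is the switch from absolute to relative path segments: instead of sending the absolute end point `to`, each command now sends the offset `delta` with `relative: true`, while `to` is still computed locally so the sketch group keeps absolute coordinates. The math, isolated from the `angledLine` hunk above (this free-standing helper is illustrative and not code from the repository):

// Given the current pen position, an angle in degrees and a length, compute
// the relative offset sent to the engine and the absolute end point kept in
// the sketch group's bookkeeping.
fn angled_offsets(from: [f64; 2], angle: f64, length: f64) -> ([f64; 2], [f64; 2]) {
    let delta = [
        length * f64::cos(angle.to_radians()),
        length * f64::sin(angle.to_radians()),
    ];
    let to = [from[0] + delta[0], from[1] + delta[1]];
    (delta, to)
}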

View File

@ -0,0 +1,171 @@
use std::str::FromStr;
use anyhow::Result;
use parse_display::{Display, FromStr};
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use tower_lsp::lsp_types::SemanticTokenType;
mod tokeniser;
/// The types of tokens.
#[derive(Debug, PartialEq, Eq, Copy, Clone, Deserialize, Serialize, JsonSchema, FromStr, Display)]
#[serde(rename_all = "camelCase")]
#[display(style = "camelCase")]
pub enum TokenType {
/// A number.
Number,
/// A word.
Word,
/// An operator.
Operator,
/// A string.
String,
/// A keyword.
Keyword,
/// A brace.
Brace,
/// Whitespace.
Whitespace,
/// A comma.
Comma,
/// A colon.
Colon,
/// A period.
Period,
/// A double period: `..`.
DoublePeriod,
/// A line comment.
LineComment,
/// A block comment.
BlockComment,
/// A function name.
Function,
}
/// Most KCL tokens correspond to LSP semantic tokens (but not all).
impl TryFrom<TokenType> for SemanticTokenType {
type Error = anyhow::Error;
fn try_from(token_type: TokenType) -> Result<Self> {
Ok(match token_type {
TokenType::Number => Self::NUMBER,
TokenType::Word => Self::VARIABLE,
TokenType::Keyword => Self::KEYWORD,
TokenType::Operator => Self::OPERATOR,
TokenType::String => Self::STRING,
TokenType::LineComment => Self::COMMENT,
TokenType::BlockComment => Self::COMMENT,
TokenType::Function => Self::FUNCTION,
TokenType::Whitespace
| TokenType::Brace
| TokenType::Comma
| TokenType::Colon
| TokenType::Period
| TokenType::DoublePeriod => {
anyhow::bail!("unsupported token type: {:?}", token_type)
}
})
}
}
impl TokenType {
// This is for the lsp server.
pub fn all_semantic_token_types() -> Result<Vec<SemanticTokenType>> {
let mut settings = schemars::gen::SchemaSettings::openapi3();
settings.inline_subschemas = true;
let mut generator = schemars::gen::SchemaGenerator::new(settings);
let schema = TokenType::json_schema(&mut generator);
let schemars::schema::Schema::Object(o) = &schema else {
anyhow::bail!("expected object schema: {:#?}", schema);
};
let Some(subschemas) = &o.subschemas else {
anyhow::bail!("expected subschemas: {:#?}", schema);
};
let Some(one_ofs) = &subschemas.one_of else {
anyhow::bail!("expected one_of: {:#?}", schema);
};
let mut semantic_tokens = vec![];
for one_of in one_ofs {
let schemars::schema::Schema::Object(o) = one_of else {
anyhow::bail!("expected object one_of: {:#?}", one_of);
};
let Some(enum_values) = o.enum_values.as_ref() else {
anyhow::bail!("expected enum values: {:#?}", o);
};
if enum_values.len() > 1 {
anyhow::bail!("expected only one enum value: {:#?}", o);
}
if enum_values.is_empty() {
anyhow::bail!("expected at least one enum value: {:#?}", o);
}
let label = TokenType::from_str(&enum_values[0].to_string().replace('"', ""))?;
if let Ok(semantic_token_type) = SemanticTokenType::try_from(label) {
semantic_tokens.push(semantic_token_type);
}
}
Ok(semantic_tokens)
}
}
#[derive(Debug, PartialEq, Eq, Deserialize, Serialize, Clone)]
pub struct Token {
#[serde(rename = "type")]
pub token_type: TokenType,
/// Offset in the source code where this token begins.
pub start: usize,
/// Offset in the source code where this token ends.
pub end: usize,
pub value: String,
}
impl Token {
pub fn from_range(range: std::ops::Range<usize>, token_type: TokenType, value: String) -> Self {
Self {
start: range.start,
end: range.end,
value,
token_type,
}
}
pub fn is_code_token(&self) -> bool {
!matches!(
self.token_type,
TokenType::Whitespace | TokenType::LineComment | TokenType::BlockComment
)
}
}
impl From<Token> for crate::executor::SourceRange {
fn from(token: Token) -> Self {
Self([token.start, token.end])
}
}
impl From<&Token> for crate::executor::SourceRange {
fn from(token: &Token) -> Self {
Self([token.start, token.end])
}
}
pub fn lexer(s: &str) -> Vec<Token> {
tokeniser::lexer(s).unwrap_or_default()
}
#[cfg(test)]
mod tests {
use super::*;
// We have this as a test so we can ensure it never panics with an unwrap in the server.
#[test]
fn test_token_type_to_semantic_token_type() {
let semantic_types = TokenType::all_semantic_token_types().unwrap();
assert!(!semantic_types.is_empty());
}
}
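
The new `token` module keeps a small public surface: `lexer()`, the `Token` and `TokenType` types, and the LSP conversions, with the winnow-based implementation private inside `tokeniser`. A short usage sketch for callers inside the crate (the sample source string is made up; the calls are the ones defined above):

fn demo_tokens() {
    // Tokenize a snippet, then drop whitespace and comments via is_code_token().
    let tokens = lexer("const x = 1 // a line comment");
    let code_tokens: Vec<&Token> = tokens.iter().filter(|t| t.is_code_token()).collect();

    // Each token carries byte offsets, so it converts into a SourceRange.
    if let Some(first) = code_tokens.first() {
        let range: crate::executor::SourceRange = (*first).into();
        let _ = range;
    }
}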

File diff suppressed because it is too large.

View File

@ -1,744 +0,0 @@
use std::str::FromStr;
use anyhow::Result;
use lazy_static::lazy_static;
use parse_display::{Display, FromStr};
use regex::Regex;
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use tower_lsp::lsp_types::SemanticTokenType;
/// The types of tokens.
#[derive(Debug, PartialEq, Eq, Copy, Clone, Deserialize, Serialize, ts_rs::TS, JsonSchema, FromStr, Display)]
#[ts(export)]
#[serde(rename_all = "camelCase")]
#[display(style = "camelCase")]
pub enum TokenType {
/// A number.
Number,
/// A word.
Word,
/// An operator.
Operator,
/// A string.
String,
/// A keyword.
Keyword,
/// A brace.
Brace,
/// Whitespace.
Whitespace,
/// A comma.
Comma,
/// A colon.
Colon,
/// A period.
Period,
/// A double period: `..`.
DoublePeriod,
/// A line comment.
LineComment,
/// A block comment.
BlockComment,
/// A function name.
Function,
}
impl TryFrom<TokenType> for SemanticTokenType {
type Error = anyhow::Error;
fn try_from(token_type: TokenType) -> Result<Self> {
Ok(match token_type {
TokenType::Number => Self::NUMBER,
TokenType::Word => Self::VARIABLE,
TokenType::Keyword => Self::KEYWORD,
TokenType::Operator => Self::OPERATOR,
TokenType::String => Self::STRING,
TokenType::LineComment => Self::COMMENT,
TokenType::BlockComment => Self::COMMENT,
TokenType::Function => Self::FUNCTION,
TokenType::Whitespace
| TokenType::Brace
| TokenType::Comma
| TokenType::Colon
| TokenType::Period
| TokenType::DoublePeriod => {
anyhow::bail!("unsupported token type: {:?}", token_type)
}
})
}
}
impl TokenType {
// This is for the lsp server.
pub fn to_semantic_token_types() -> Result<Vec<SemanticTokenType>> {
let mut settings = schemars::gen::SchemaSettings::openapi3();
settings.inline_subschemas = true;
let mut generator = schemars::gen::SchemaGenerator::new(settings);
let schema = TokenType::json_schema(&mut generator);
let schemars::schema::Schema::Object(o) = &schema else {
anyhow::bail!("expected object schema: {:#?}", schema);
};
let Some(subschemas) = &o.subschemas else {
anyhow::bail!("expected subschemas: {:#?}", schema);
};
let Some(one_ofs) = &subschemas.one_of else {
anyhow::bail!("expected one_of: {:#?}", schema);
};
let mut semantic_tokens = vec![];
for one_of in one_ofs {
let schemars::schema::Schema::Object(o) = one_of else {
anyhow::bail!("expected object one_of: {:#?}", one_of);
};
let Some(enum_values) = o.enum_values.as_ref() else {
anyhow::bail!("expected enum values: {:#?}", o);
};
if enum_values.len() > 1 {
anyhow::bail!("expected only one enum value: {:#?}", o);
}
if enum_values.is_empty() {
anyhow::bail!("expected at least one enum value: {:#?}", o);
}
let label = TokenType::from_str(&enum_values[0].to_string().replace('"', ""))?;
if let Ok(semantic_token_type) = SemanticTokenType::try_from(label) {
semantic_tokens.push(semantic_token_type);
}
}
Ok(semantic_tokens)
}
}
#[derive(Debug, PartialEq, Eq, Deserialize, Serialize, Clone, ts_rs::TS)]
#[ts(export)]
pub struct Token {
#[serde(rename = "type")]
pub token_type: TokenType,
pub start: usize,
pub end: usize,
pub value: String,
}
impl From<Token> for crate::executor::SourceRange {
fn from(token: Token) -> Self {
Self([token.start, token.end])
}
}
impl From<&Token> for crate::executor::SourceRange {
fn from(token: &Token) -> Self {
Self([token.start, token.end])
}
}
lazy_static! {
static ref NUMBER: Regex = Regex::new(r"^(\d+(\.\d*)?|\.\d+)\b").unwrap();
static ref WHITESPACE: Regex = Regex::new(r"\s+").unwrap();
static ref WORD: Regex = Regex::new(r"^[a-zA-Z_][a-zA-Z0-9_]*").unwrap();
// TODO: these should be generated using our struct types for these.
static ref KEYWORD: Regex =
Regex::new(r"^(if|else|for|while|return|break|continue|fn|let|mut|loop|true|false|nil|and|or|not|var|const)\b").unwrap();
static ref OPERATOR: Regex = Regex::new(r"^(>=|<=|==|=>|!= |\|>|\*|\+|-|/|%|=|<|>|\||\^)").unwrap();
static ref STRING: Regex = Regex::new(r#"^"([^"\\]|\\.)*"|'([^'\\]|\\.)*'"#).unwrap();
static ref BLOCK_START: Regex = Regex::new(r"^\{").unwrap();
static ref BLOCK_END: Regex = Regex::new(r"^\}").unwrap();
static ref PARAN_START: Regex = Regex::new(r"^\(").unwrap();
static ref PARAN_END: Regex = Regex::new(r"^\)").unwrap();
static ref ARRAY_START: Regex = Regex::new(r"^\[").unwrap();
static ref ARRAY_END: Regex = Regex::new(r"^\]").unwrap();
static ref COMMA: Regex = Regex::new(r"^,").unwrap();
static ref COLON: Regex = Regex::new(r"^:").unwrap();
static ref PERIOD: Regex = Regex::new(r"^\.").unwrap();
static ref DOUBLE_PERIOD: Regex = Regex::new(r"^\.\.").unwrap();
static ref LINECOMMENT: Regex = Regex::new(r"^//.*").unwrap();
static ref BLOCKCOMMENT: Regex = Regex::new(r"^/\*[\s\S]*?\*/").unwrap();
}
fn is_number(character: &str) -> bool {
NUMBER.is_match(character)
}
fn is_whitespace(character: &str) -> bool {
WHITESPACE.is_match(character)
}
fn is_word(character: &str) -> bool {
WORD.is_match(character)
}
fn is_keyword(character: &str) -> bool {
KEYWORD.is_match(character)
}
fn is_string(character: &str) -> bool {
match STRING.find(character) {
Some(m) => m.start() == 0,
None => false,
}
}
fn is_operator(character: &str) -> bool {
OPERATOR.is_match(character)
}
fn is_block_start(character: &str) -> bool {
BLOCK_START.is_match(character)
}
fn is_block_end(character: &str) -> bool {
BLOCK_END.is_match(character)
}
fn is_paran_start(character: &str) -> bool {
PARAN_START.is_match(character)
}
fn is_paran_end(character: &str) -> bool {
PARAN_END.is_match(character)
}
fn is_array_start(character: &str) -> bool {
ARRAY_START.is_match(character)
}
fn is_array_end(character: &str) -> bool {
ARRAY_END.is_match(character)
}
fn is_comma(character: &str) -> bool {
COMMA.is_match(character)
}
fn is_colon(character: &str) -> bool {
COLON.is_match(character)
}
fn is_double_period(character: &str) -> bool {
DOUBLE_PERIOD.is_match(character)
}
fn is_period(character: &str) -> bool {
PERIOD.is_match(character)
}
fn is_line_comment(character: &str) -> bool {
LINECOMMENT.is_match(character)
}
fn is_block_comment(character: &str) -> bool {
BLOCKCOMMENT.is_match(character)
}
fn match_first(s: &str, regex: &Regex) -> Option<String> {
regex.find(s).map(|the_match| the_match.as_str().to_string())
}
fn make_token(token_type: TokenType, value: &str, start: usize) -> Token {
Token {
token_type,
value: value.to_string(),
start,
end: start + value.len(),
}
}
fn return_token_at_index(s: &str, start_index: usize) -> Option<Token> {
let str_from_index = &s.chars().skip(start_index).collect::<String>();
if is_string(str_from_index) {
return Some(make_token(
TokenType::String,
&match_first(str_from_index, &STRING)?,
start_index,
));
}
let is_line_comment_bool = is_line_comment(str_from_index);
if is_line_comment_bool || is_block_comment(str_from_index) {
return Some(make_token(
if is_line_comment_bool {
TokenType::LineComment
} else {
TokenType::BlockComment
},
&match_first(
str_from_index,
if is_line_comment_bool {
&LINECOMMENT
} else {
&BLOCKCOMMENT
},
)?,
start_index,
));
}
if is_paran_end(str_from_index) {
return Some(make_token(
TokenType::Brace,
&match_first(str_from_index, &PARAN_END)?,
start_index,
));
}
if is_paran_start(str_from_index) {
return Some(make_token(
TokenType::Brace,
&match_first(str_from_index, &PARAN_START)?,
start_index,
));
}
if is_block_start(str_from_index) {
return Some(make_token(
TokenType::Brace,
&match_first(str_from_index, &BLOCK_START)?,
start_index,
));
}
if is_block_end(str_from_index) {
return Some(make_token(
TokenType::Brace,
&match_first(str_from_index, &BLOCK_END)?,
start_index,
));
}
if is_array_start(str_from_index) {
return Some(make_token(
TokenType::Brace,
&match_first(str_from_index, &ARRAY_START)?,
start_index,
));
}
if is_array_end(str_from_index) {
return Some(make_token(
TokenType::Brace,
&match_first(str_from_index, &ARRAY_END)?,
start_index,
));
}
if is_comma(str_from_index) {
return Some(make_token(
TokenType::Comma,
&match_first(str_from_index, &COMMA)?,
start_index,
));
}
if is_operator(str_from_index) {
return Some(make_token(
TokenType::Operator,
&match_first(str_from_index, &OPERATOR)?,
start_index,
));
}
if is_number(str_from_index) {
return Some(make_token(
TokenType::Number,
&match_first(str_from_index, &NUMBER)?,
start_index,
));
}
if is_keyword(str_from_index) {
return Some(make_token(
TokenType::Keyword,
&match_first(str_from_index, &KEYWORD)?,
start_index,
));
}
if is_word(str_from_index) {
return Some(make_token(
TokenType::Word,
&match_first(str_from_index, &WORD)?,
start_index,
));
}
if is_colon(str_from_index) {
return Some(make_token(
TokenType::Colon,
&match_first(str_from_index, &COLON)?,
start_index,
));
}
if is_double_period(str_from_index) {
return Some(make_token(
TokenType::DoublePeriod,
&match_first(str_from_index, &DOUBLE_PERIOD)?,
start_index,
));
}
if is_period(str_from_index) {
return Some(make_token(
TokenType::Period,
&match_first(str_from_index, &PERIOD)?,
start_index,
));
}
if is_whitespace(str_from_index) {
return Some(make_token(
TokenType::Whitespace,
&match_first(str_from_index, &WHITESPACE)?,
start_index,
));
}
None
}
fn recursively_tokenise(s: &str, current_index: usize, previous_tokens: Vec<Token>) -> Vec<Token> {
if current_index >= s.len() {
return previous_tokens;
}
let token = return_token_at_index(s, current_index);
let Some(token) = token else {
return recursively_tokenise(s, current_index + 1, previous_tokens);
};
let mut new_tokens = previous_tokens;
let token_length = token.value.len();
new_tokens.push(token);
recursively_tokenise(s, current_index + token_length, new_tokens)
}
pub fn lexer(s: &str) -> Vec<Token> {
recursively_tokenise(s, 0, Vec::new())
}
#[cfg(test)]
mod tests {
use pretty_assertions::assert_eq;
use super::*;
#[test]
fn is_number_test() {
assert!(is_number("1"));
assert!(is_number("1 abc"));
assert!(is_number("1.1"));
assert!(is_number("1.1 abc"));
assert!(!is_number("a"));
assert!(is_number("1"));
assert!(is_number(".1"));
assert!(is_number("5?"));
assert!(is_number("5 + 6"));
assert!(is_number("5 + a"));
assert!(is_number("5.5"));
assert!(!is_number("1abc"));
assert!(!is_number("a"));
assert!(!is_number("?"));
assert!(!is_number("?5"));
}
#[test]
fn is_whitespace_test() {
assert!(is_whitespace(" "));
assert!(is_whitespace(" "));
assert!(is_whitespace(" a"));
assert!(is_whitespace("a "));
assert!(!is_whitespace("a"));
assert!(!is_whitespace("?"));
}
#[test]
fn is_word_test() {
assert!(is_word("a"));
assert!(is_word("a "));
assert!(is_word("a5"));
assert!(is_word("a5a"));
assert!(!is_word("5"));
assert!(!is_word("5a"));
assert!(!is_word("5a5"));
}
#[test]
fn is_string_test() {
assert!(is_string("\"\""));
assert!(is_string("\"a\""));
assert!(is_string("\"a\" "));
assert!(is_string("\"a\"5"));
assert!(is_string("'a'5"));
assert!(is_string("\"with escaped \\\" backslash\""));
assert!(!is_string("\""));
assert!(!is_string("\"a"));
assert!(!is_string("a\""));
assert!(!is_string(" \"a\""));
assert!(!is_string("5\"a\""));
assert!(!is_string("a + 'str'"));
assert!(is_string("'c'"));
}
#[test]
fn is_operator_test() {
assert!(is_operator("+"));
assert!(is_operator("+ "));
assert!(is_operator("-"));
assert!(is_operator("<="));
assert!(is_operator("<= "));
assert!(is_operator(">="));
assert!(is_operator(">= "));
assert!(is_operator("> "));
assert!(is_operator("< "));
assert!(is_operator("| "));
assert!(is_operator("|> "));
assert!(is_operator("^ "));
assert!(is_operator("% "));
assert!(is_operator("+* "));
assert!(!is_operator("5 + 5"));
assert!(!is_operator("a"));
assert!(!is_operator("a+"));
assert!(!is_operator("a+5"));
assert!(!is_operator("5a+5"));
assert!(!is_operator(", newVar"));
assert!(!is_operator(","));
}
#[test]
fn is_block_start_test() {
assert!(is_block_start("{"));
assert!(is_block_start("{ "));
assert!(is_block_start("{5"));
assert!(is_block_start("{a"));
assert!(is_block_start("{5 "));
assert!(!is_block_start("5"));
assert!(!is_block_start("5 + 5"));
assert!(!is_block_start("5{ + 5"));
assert!(!is_block_start("a{ + 5"));
assert!(!is_block_start(" { + 5"));
}
#[test]
fn is_block_end_test() {
assert!(is_block_end("}"));
assert!(is_block_end("} "));
assert!(is_block_end("}5"));
assert!(is_block_end("}5 "));
assert!(!is_block_end("5"));
assert!(!is_block_end("5 + 5"));
assert!(!is_block_end("5} + 5"));
assert!(!is_block_end(" } + 5"));
}
#[test]
fn is_paran_start_test() {
assert!(is_paran_start("("));
assert!(is_paran_start("( "));
assert!(is_paran_start("(5"));
assert!(is_paran_start("(5 "));
assert!(is_paran_start("(5 + 5"));
assert!(is_paran_start("(5 + 5)"));
assert!(is_paran_start("(5 + 5) "));
assert!(!is_paran_start("5"));
assert!(!is_paran_start("5 + 5"));
assert!(!is_paran_start("5( + 5)"));
assert!(!is_paran_start(" ( + 5)"));
}
#[test]
fn is_paran_end_test() {
assert!(is_paran_end(")"));
assert!(is_paran_end(") "));
assert!(is_paran_end(")5"));
assert!(is_paran_end(")5 "));
assert!(!is_paran_end("5"));
assert!(!is_paran_end("5 + 5"));
assert!(!is_paran_end("5) + 5"));
assert!(!is_paran_end(" ) + 5"));
}
#[test]
fn is_comma_test() {
assert!(is_comma(","));
assert!(is_comma(", "));
assert!(is_comma(",5"));
assert!(is_comma(",5 "));
assert!(!is_comma("5"));
assert!(!is_comma("5 + 5"));
assert!(!is_comma("5, + 5"));
assert!(!is_comma(" , + 5"));
}
#[test]
fn is_line_comment_test() {
assert!(is_line_comment("//"));
assert!(is_line_comment("// "));
assert!(is_line_comment("//5"));
assert!(is_line_comment("//5 "));
assert!(!is_line_comment("5"));
assert!(!is_line_comment("5 + 5"));
assert!(!is_line_comment("5// + 5"));
assert!(!is_line_comment(" // + 5"));
}
#[test]
fn is_block_comment_test() {
assert!(is_block_comment("/* */"));
assert!(is_block_comment("/***/"));
assert!(is_block_comment("/*5*/"));
assert!(is_block_comment("/*5 */"));
assert!(!is_block_comment("/*"));
assert!(!is_block_comment("5"));
assert!(!is_block_comment("5 + 5"));
assert!(!is_block_comment("5/* + 5"));
assert!(!is_block_comment(" /* + 5"));
assert!(!is_block_comment(
r#" /* and
here
*/
"#
));
}
#[test]
fn make_token_test() {
assert_eq!(
make_token(TokenType::Keyword, "const", 56),
Token {
token_type: TokenType::Keyword,
value: "const".to_string(),
start: 56,
end: 61,
}
);
}
#[test]
fn return_token_at_index_test() {
assert_eq!(
return_token_at_index("const", 0),
Some(Token {
token_type: TokenType::Keyword,
value: "const".to_string(),
start: 0,
end: 5,
})
);
assert_eq!(
return_token_at_index(" 4554", 2),
Some(Token {
token_type: TokenType::Number,
value: "4554".to_string(),
start: 2,
end: 6,
})
);
}
#[test]
fn lexer_test() {
assert_eq!(
lexer("const a=5"),
vec![
Token {
token_type: TokenType::Keyword,
value: "const".to_string(),
start: 0,
end: 5,
},
Token {
token_type: TokenType::Whitespace,
value: " ".to_string(),
start: 5,
end: 6,
},
Token {
token_type: TokenType::Word,
value: "a".to_string(),
start: 6,
end: 7,
},
Token {
token_type: TokenType::Operator,
value: "=".to_string(),
start: 7,
end: 8,
},
Token {
token_type: TokenType::Number,
value: "5".to_string(),
start: 8,
end: 9,
},
]
);
assert_eq!(
lexer("54 + 22500 + 6"),
vec![
Token {
token_type: TokenType::Number,
value: "54".to_string(),
start: 0,
end: 2,
},
Token {
token_type: TokenType::Whitespace,
value: " ".to_string(),
start: 2,
end: 3,
},
Token {
token_type: TokenType::Operator,
value: "+".to_string(),
start: 3,
end: 4,
},
Token {
token_type: TokenType::Whitespace,
value: " ".to_string(),
start: 4,
end: 5,
},
Token {
token_type: TokenType::Number,
value: "22500".to_string(),
start: 5,
end: 10,
},
Token {
token_type: TokenType::Whitespace,
value: " ".to_string(),
start: 10,
end: 11,
},
Token {
token_type: TokenType::Operator,
value: "+".to_string(),
start: 11,
end: 12,
},
Token {
token_type: TokenType::Whitespace,
value: " ".to_string(),
start: 12,
end: 13,
},
Token {
token_type: TokenType::Number,
value: "6".to_string(),
start: 13,
end: 14,
},
]
);
}
// We have this as a test so we can ensure it never panics with an unwrap in the server.
#[test]
fn test_token_type_to_semantic_token_type() {
let semantic_types = TokenType::to_semantic_token_types().unwrap();
assert!(!semantic_types.is_empty());
}
#[test]
fn test_lexer_negative_word() {
assert_eq!(
lexer("-legX"),
vec![
Token {
token_type: TokenType::Operator,
value: "-".to_string(),
start: 0,
end: 1,
},
Token {
token_type: TokenType::Word,
value: "legX".to_string(),
start: 1,
end: 5,
},
]
);
}
}

View File

@@ -21,11 +21,12 @@ pub async fn execute_wasm(
let program: kcl_lib::ast::types::Program = serde_json::from_str(program_str).map_err(|e| e.to_string())?;
let mut mem: kcl_lib::executor::ProgramMemory = serde_json::from_str(memory_str).map_err(|e| e.to_string())?;
- let mut engine = kcl_lib::engine::EngineConnection::new(manager)
+ let engine = kcl_lib::engine::EngineConnection::new(manager)
.await
.map_err(|e| format!("{:?}", e))?;
- let memory = kcl_lib::executor::execute(program, &mut mem, kcl_lib::executor::BodyType::Root, &mut engine)
+ let memory = kcl_lib::executor::execute(program, &mut mem, kcl_lib::executor::BodyType::Root, &engine)
+ .await
.map_err(String::from)?;
// The serde-wasm-bindgen does not work here because of weird HashMap issues so we use the
// gloo-serialize crate instead.
@@ -83,13 +84,13 @@ pub fn deserialize_files(data: &[u8]) -> Result<JsValue, JsError> {
// test for this function and by extension lexer are done in javascript land src/lang/tokeniser.test.ts
#[wasm_bindgen]
pub fn lexer_js(js: &str) -> Result<JsValue, JsError> {
- let tokens = kcl_lib::tokeniser::lexer(js);
+ let tokens = kcl_lib::token::lexer(js);
Ok(JsValue::from_serde(&tokens)?)
}
#[wasm_bindgen]
pub fn parse_js(js: &str) -> Result<JsValue, String> {
- let tokens = kcl_lib::tokeniser::lexer(js);
+ let tokens = kcl_lib::token::lexer(js);
let parser = kcl_lib::parser::Parser::new(tokens);
let program = parser.ast().map_err(String::from)?;
// The serde-wasm-bindgen does not work here because of weird HashMap issues so we use the
@@ -148,7 +149,7 @@ pub async fn lsp_run(config: ServerConfig) -> Result<(), JsValue> {
let stdlib_signatures = get_signatures_from_stdlib(&stdlib).map_err(|e| e.to_string())?;
// We can unwrap here because we know the tokeniser is valid, since
// we have a test for it.
- let token_types = kcl_lib::tokeniser::TokenType::to_semantic_token_types().unwrap();
+ let token_types = kcl_lib::token::TokenType::all_semantic_token_types().unwrap();
let (service, socket) = LspService::new(|client| Backend {
client,
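Every call site updated in this file follows the same tokenise-then-parse flow, only with the module path renamed from kcl_lib::tokeniser to kcl_lib::token. Condensed into one hedged sketch (error handling simplified to match parse_js above; not a verbatim excerpt from the crate):

    // Minimal sketch of the shared flow behind lexer_js and parse_js.
    fn parse_source(source: &str) -> Result<kcl_lib::ast::types::Program, String> {
        // Lex the raw source with the renamed token module.
        let tokens = kcl_lib::token::lexer(source);
        // Hand the token stream to the parser to build the AST.
        let parser = kcl_lib::parser::Parser::new(tokens);
        parser.ast().map_err(String::from)
    }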

View File

@@ -0,0 +1,310 @@
const svg = startSketchAt([0, 0])
|> lineTo([2.52, -26.04], %) // MoveAbsolute
|> lineTo([2.52, -25.2], %) // VerticalLineAbsolute
|> lineTo([0.84, -25.2], %) // HorizontalLineAbsolute
|> lineTo([0.84, -24.36], %) // VerticalLineAbsolute
|> lineTo([0, -24.36], %) // HorizontalLineAbsolute
|> lineTo([0, -6.72], %) // VerticalLineAbsolute
|> lineTo([0.84, -6.72], %) // HorizontalLineAbsolute
|> lineTo([0.84, -5.88], %) // VerticalLineAbsolute
|> lineTo([1.68, -5.88], %) // HorizontalLineAbsolute
|> lineTo([1.68, -5.04], %) // VerticalLineAbsolute
|> lineTo([2.52, -5.04], %) // HorizontalLineAbsolute
|> lineTo([2.52, -4.2], %) // VerticalLineAbsolute
|> lineTo([3.36, -4.2], %) // HorizontalLineAbsolute
|> lineTo([3.36, -3.36], %) // VerticalLineAbsolute
|> lineTo([17.64, -3.36], %) // HorizontalLineAbsolute
|> lineTo([17.64, -4.2], %) // VerticalLineAbsolute
|> lineTo([18.48, -4.2], %) // HorizontalLineRelative
|> lineTo([18.48, -5.04], %) // VerticalLineHorizonal
|> lineTo([19.32, -5.04], %) // HorizontalLineRelative
|> lineTo([19.32, -5.88], %) // VerticalLineHorizonal
|> lineTo([20.16, -5.88], %) // HorizontalLineRelative
|> lineTo([20.16, -6.72], %) // VerticalLineAbsolute
|> lineTo([21, -6.72], %) // HorizontalLineAbsolute
|> lineTo([21, -24.36], %) // VerticalLineHorizonal
|> lineTo([20.16, -24.36], %) // HorizontalLineRelative
|> lineTo([20.16, -25.2], %) // VerticalLineHorizonal
|> lineTo([18.48, -25.2], %) // HorizontalLineRelative
|> lineTo([18.48, -26.04], %) // VerticalLineHorizonal
|> lineTo([15.96, -26.04], %) // HorizontalLineRelative
|> lineTo([15.96, -26.88], %) // VerticalLineHorizonal
|> lineTo([16.8, -26.88], %) // HorizontalLineRelative
|> lineTo([16.8, -28.56], %) // VerticalLineHorizonal
|> lineTo([11.76, -28.56], %) // HorizontalLineAbsolute
|> lineTo([11.76, -26.88], %) // VerticalLineAbsolute
|> lineTo([12.6, -26.88], %) // HorizontalLineAbsolute
|> lineTo([12.6, -26.04], %) // VerticalLineAbsolute
|> lineTo([8.4, -26.04], %) // HorizontalLineAbsolute
|> lineTo([8.4, -26.88], %) // VerticalLineHorizonal
|> lineTo([9.24, -26.88], %) // HorizontalLineRelative
|> lineTo([9.24, -28.56], %) // VerticalLineHorizonal
|> lineTo([4.2, -28.56], %) // HorizontalLineAbsolute
|> lineTo([4.2, -26.88], %) // VerticalLineHorizonal
|> lineTo([5.04, -26.88], %) // HorizontalLineRelative
|> lineTo([5.04, -26.04], %) // VerticalLineHorizonal
|> lineTo([0.839996, -20.58], %) // MoveRelative
|> lineTo([0.839996, -24.36], %) // VerticalLineHorizonal
|> lineTo([2.52, -24.36], %) // HorizontalLineAbsolute
|> lineTo([2.52, -25.2], %) // VerticalLineHorizonal
|> lineTo([18.48, -25.2], %) // HorizontalLineRelative
|> lineTo([18.48, -24.36], %) // VerticalLineHorizonal
|> lineTo([20.16, -24.36], %) // HorizontalLineRelative
|> lineTo([20.16, -20.58], %) // VerticalLineAbsolute
// StopAbsolute
|> lineTo([7.56, -24.36], %) // MoveAbsolute
|> lineTo([7.56, -22.68], %) // VerticalLineHorizonal
|> lineTo([13.44, -22.68], %) // HorizontalLineRelative
|> lineTo([13.44, -24.36], %) // VerticalLineHorizonal
|> lineTo([1.68, -22.68], %) // MoveRelative
|> lineTo([1.68, -21.84], %) // VerticalLineHorizonal
|> lineTo([5.88, -21.84], %) // HorizontalLineRelative
|> lineTo([5.88, -22.68], %) // VerticalLineHorizonal
|> lineTo([3.36, -24.36], %) // MoveRelative
|> lineTo([3.36, -23.52], %) // VerticalLineHorizonal
|> lineTo([5.88, -23.52], %) // HorizontalLineRelative
|> lineTo([5.88, -24.36], %) // VerticalLineHorizonal
|> lineTo([15.12, -22.68], %) // MoveRelative
|> lineTo([15.12, -21.84], %) // VerticalLineHorizonal
|> lineTo([15.959999999999999, -21.84], %) // HorizontalLineRelative
|> lineTo([15.959999999999999, -22.68], %) // VerticalLineHorizonal
|> lineTo([16.8, -22.68], %) // MoveRelative
|> lineTo([16.8, -21.84], %) // VerticalLineHorizonal
|> lineTo([17.64, -21.84], %) // HorizontalLineRelative
|> lineTo([17.64, -22.68], %) // VerticalLineHorizonal
|> lineTo([18.48, -22.68], %) // MoveRelative
|> lineTo([18.48, -21.84], %) // VerticalLineHorizonal
|> lineTo([19.32, -21.84], %) // HorizontalLineRelative
|> lineTo([19.32, -22.68], %) // VerticalLineHorizonal
|> lineTo([15.12, -24.36], %) // MoveRelative
|> lineTo([15.12, -23.52], %) // VerticalLineHorizonal
|> lineTo([17.64, -23.52], %) // HorizontalLineRelative
|> lineTo([17.64, -24.36], %) // VerticalLineHorizonal
|> lineTo([18.48, -5.88], %) // MoveAbsolute
|> lineTo([18.48, -5.04], %) // VerticalLineAbsolute
|> lineTo([17.64, -5.04], %) // HorizontalLineAbsolute
|> lineTo([17.64, -4.2], %) // VerticalLineAbsolute
|> lineTo([3.36, -4.2], %) // HorizontalLineAbsolute
|> lineTo([3.36, -5.04], %) // VerticalLineAbsolute
|> lineTo([2.52, -5.04], %) // HorizontalLineAbsolute
|> lineTo([2.52, -5.88], %) // VerticalLineAbsolute
|> lineTo([1.68, -5.88], %) // HorizontalLineAbsolute
|> lineTo([1.68, -6.72], %) // VerticalLineAbsolute
|> lineTo([0.839996, -6.72], %) // HorizontalLineAbsolute
|> lineTo([0.839996, -8.4], %) // VerticalLineAbsolute
|> lineTo([20.16, -8.4], %) // HorizontalLineAbsolute
|> lineTo([20.16, -6.72], %) // VerticalLineAbsolute
|> lineTo([19.32, -6.72], %) // HorizontalLineAbsolute
|> lineTo([19.32, -5.88], %) // VerticalLineAbsolute
|> lineTo([20.16, -7.56], %) // MoveAbsolute
|> lineTo([0.839996, -7.56], %) // HorizontalLineAbsolute
|> lineTo([0.839996, -19.32], %) // VerticalLineAbsolute
|> lineTo([20.16, -19.32], %) // HorizontalLineAbsolute
|> lineTo([3.36, -10.08], %) // MoveAbsolute
|> lineTo([3.36, -9.24001], %) // VerticalLineAbsolute
|> lineTo([17.64, -9.24001], %) // HorizontalLineAbsolute
|> lineTo([17.64, -10.08], %) // VerticalLineAbsolute
|> lineTo([18.48, -10.08], %) // HorizontalLineRelative
|> lineTo([18.48, -16.8], %) // VerticalLineHorizonal
|> lineTo([17.64, -16.8], %) // HorizontalLineRelative
|> lineTo([17.64, -17.64], %) // VerticalLineHorizonal
|> lineTo([3.36, -17.64], %) // HorizontalLineAbsolute
|> lineTo([3.36, -16.8], %) // VerticalLineAbsolute
|> lineTo([2.52, -16.8], %) // HorizontalLineAbsolute
|> lineTo([2.52, -10.080000000000002], %) // VerticalLineHorizonal
|> lineTo([13.44, -10.92], %) // MoveRelative
|> lineTo([13.44, -10.08], %) // VerticalLineHorizonal
|> lineTo([15.12, -10.08], %) // HorizontalLineRelative
|> lineTo([15.12, -13.44], %) // VerticalLineHorizonal
|> lineTo([14.28, -13.44], %) // HorizontalLineRelative
|> lineTo([9.24, -13.44], %) // MoveRelative
|> lineTo([11.76, -13.44], %) // HorizontalLineRelative
|> lineTo([11.76, -14.28], %) // VerticalLineHorizonal
|> lineTo([10.92, -14.28], %) // HorizontalLineRelative here
|> lineTo([10.92, -15.959999999999999], %) // VerticalLineHorizonal
|> lineTo([13.44, -15.959999999999999], %) // HorizontalLineRelative
|> lineTo([13.44, -15.12], %) // VerticalLineHorizonal
|> lineTo([14.28, -15.12], %) // HorizontalLineRelative
|> lineTo([14.28, -15.959999999999999], %) // VerticalLineHorizonal
|> lineTo([13.44, -15.959999999999999], %) // HorizontalLineAbsolute
|> lineTo([13.44, -16.8], %) // VerticalLineAbsolute
|> lineTo([7.56, -16.8], %) // HorizontalLineAbsolute
|> lineTo([7.56, -15.96], %) // VerticalLineAbsolute
|> lineTo([6.72, -15.96], %) // HorizontalLineAbsolute
|> lineTo([6.72, -15.120000000000001], %) // VerticalLineHorizonal
|> lineTo([7.56, -15.120000000000001], %) // HorizontalLineRelative
|> lineTo([7.56, -15.96], %) // VerticalLineHorizonal
|> lineTo([10.08, -15.96], %) // HorizontalLineRelative
|> lineTo([10.08, -14.28], %) // VerticalLineAbsolute
|> lineTo([9.24, -14.28], %) // HorizontalLineAbsolute
|> lineTo([7.56, -12.6], %) // MoveAbsolute
|> lineTo([7.56, -11.76], %) // VerticalLineAbsolute
|> lineTo([5.04, -11.76], %) // HorizontalLineAbsolute
|> lineTo([5.04, -12.6], %) // VerticalLineAbsolute
|> lineTo([4.2, -12.6], %) // HorizontalLineAbsolute
|> lineTo([4.2, -11.76], %) // VerticalLineHorizonal
|> lineTo([5.04, -11.76], %) // HorizontalLineRelative
|> lineTo([5.04, -10.92], %) // VerticalLineHorizonal
|> lineTo([7.5600000000000005, -10.92], %) // HorizontalLineRelative
|> lineTo([7.5600000000000005, -11.76], %) // VerticalLineHorizonal
|> lineTo([8.4, -11.76], %) // HorizontalLineAbsolute
|> lineTo([8.4, -12.6], %) // VerticalLineHorizonal
|> lineTo([3.36, -5.88], %) // MoveAbsolute
|> lineTo([3.36, -5.04], %) // VerticalLineAbsolute
|> lineTo([4.2, -5.04], %) // HorizontalLineAbsolute
|> lineTo([4.2, -3.36], %) // VerticalLineAbsolute
|> lineTo([5.04, -3.36], %) // HorizontalLineAbsolute
|> lineTo([5.04, -1.68], %) // VerticalLineAbsolute
|> lineTo([5.88, -1.68], %) // HorizontalLineAbsolute
|> lineTo([5.88, -0.83999599], %) // VerticalLineAbsolute
|> lineTo([6.72, -0.83999599], %) // HorizontalLineAbsolute
|> lineTo([6.72, -1.68], %) // VerticalLineAbsolute
|> lineTo([7.56, -1.68], %) // HorizontalLineAbsolute
|> lineTo([7.56, -3.36], %) // VerticalLineAbsolute
|> lineTo([8.4, -3.36], %) // HorizontalLineAbsolute
|> lineTo([8.4, -5.04], %) // VerticalLineHorizonal
|> lineTo([9.24, -5.04], %) // HorizontalLineRelative
|> lineTo([9.24, -5.88], %) // VerticalLineHorizonal
|> lineTo([17.64, -5.04], %) // MoveAbsolute
|> lineTo([17.64, -5.88], %) // VerticalLineAbsolute
|> lineTo([11.76, -5.88], %) // HorizontalLineAbsolute
|> lineTo([11.76, -5.04], %) // VerticalLineAbsolute
|> lineTo([12.6, -5.04], %) // HorizontalLineAbsolute
|> lineTo([12.6, -3.36], %) // VerticalLineAbsolute
|> lineTo([13.44, -3.36], %) // HorizontalLineRelative
|> lineTo([13.44, -1.68], %) // VerticalLineAbsolute
|> lineTo([14.28, -1.68], %) // HorizontalLineRelative
|> lineTo([14.28, -0.83999599], %) // VerticalLineAbsolute
|> lineTo([15.12, -0.83999599], %) // HorizontalLineRelative
|> lineTo([15.12, -1.68], %) // VerticalLineAbsolute
|> lineTo([15.959999999999999, -1.68], %) // HorizontalLineRelative
|> lineTo([15.959999999999999, -3.36], %) // VerticalLineHorizonal
|> lineTo([16.8, -3.36], %) // HorizontalLineRelative
|> lineTo([16.8, -5.04], %) // VerticalLineHorizonal
|> lineTo([13.44, -1.68], %) // MoveAbsolute
|> lineTo([13.44, -0], %) // VerticalLineAbsolute
|> lineTo([15.959999999999999, -0], %) // HorizontalLineRelative
|> lineTo([15.959999999999999, -1.68], %) // VerticalLineHorizonal
|> lineTo([16.8, -1.68], %) // HorizontalLineRelative
|> lineTo([16.8, -3.36], %) // VerticalLineHorizonal
|> lineTo([17.64, -3.36], %) // HorizontalLineRelative
|> lineTo([17.64, -4.62], %) // VerticalLineAbsolute
|> lineTo([16.8, -4.62], %) // HorizontalLineAbsolute
|> lineTo([16.8, -3.36], %) // VerticalLineAbsolute
|> lineTo([15.96, -3.36], %) // HorizontalLineAbsolute
|> lineTo([15.96, -1.68], %) // VerticalLineAbsolute
|> lineTo([15.12, -1.68], %) // HorizontalLineAbsolute
|> lineTo([15.12, -0.83999999], %) // VerticalLineAbsolute
|> lineTo([14.28, -0.83999999], %) // HorizontalLineAbsolute
|> lineTo([14.28, -1.68], %) // VerticalLineAbsolute
|> lineTo([13.44, -1.68], %) // HorizontalLineAbsolute
|> lineTo([13.44, -3.36], %) // VerticalLineAbsolute
|> lineTo([12.6, -3.36], %) // HorizontalLineAbsolute
|> lineTo([12.6, -4.62], %) // VerticalLineAbsolute
|> lineTo([11.76, -4.62], %) // HorizontalLineAbsolute
|> lineTo([11.76, -3.36], %) // VerticalLineAbsolute
|> lineTo([12.6, -3.36], %) // HorizontalLineAbsolute
|> lineTo([12.6, -1.68], %) // VerticalLineAbsolute
|> lineTo([5.04, -1.68], %) // MoveAbsolute
|> lineTo([5.04, -0], %) // VerticalLineAbsolute
|> lineTo([7.56, -0], %) // HorizontalLineAbsolute
|> lineTo([7.56, -1.68], %) // VerticalLineAbsolute
|> lineTo([8.4, -1.68], %) // HorizontalLineAbsolute
|> lineTo([8.4, -3.36], %) // VerticalLineAbsolute
|> lineTo([9.24, -3.36], %) // HorizontalLineAbsolute
|> lineTo([9.24, -4.62], %) // VerticalLineAbsolute
|> lineTo([8.4, -4.62], %) // HorizontalLineAbsolute
|> lineTo([8.4, -3.36], %) // VerticalLineAbsolute
|> lineTo([7.56, -3.36], %) // HorizontalLineAbsolute
|> lineTo([7.56, -1.68], %) // VerticalLineAbsolute
|> lineTo([6.72, -1.68], %) // HorizontalLineAbsolute
|> lineTo([6.72, -0.83999999], %) // VerticalLineAbsolute
|> lineTo([5.88, -0.83999999], %) // HorizontalLineAbsolute
|> lineTo([5.88, -1.68], %) // VerticalLineAbsolute
|> lineTo([5.04, -1.68], %) // HorizontalLineAbsolute
|> lineTo([5.04, -3.36], %) // VerticalLineAbsolute
|> lineTo([4.2, -3.36], %) // HorizontalLineAbsolute
|> lineTo([4.2, -4.62], %) // VerticalLineAbsolute
|> lineTo([3.36, -4.62], %) // HorizontalLineAbsolute
|> lineTo([3.36, -3.36], %) // VerticalLineAbsolute
|> lineTo([4.2, -3.36], %) // HorizontalLineAbsolute
|> lineTo([4.2, -1.68], %) // VerticalLineAbsolute
|> lineTo([13.44, -5.88], %) // MoveAbsolute
|> lineTo([13.44, -5.04], %) // VerticalLineAbsolute
|> lineTo([14.28, -5.04], %) // HorizontalLineRelative
|> lineTo([14.28, -4.2], %) // VerticalLineAbsolute
|> lineTo([15.12, -4.2], %) // HorizontalLineRelative
|> lineTo([15.12, -5.04], %) // VerticalLineHorizonal
|> lineTo([15.959999999999999, -5.04], %) // HorizontalLineRelative
|> lineTo([15.959999999999999, -5.88], %) // VerticalLineHorizonal
|> lineTo([5.88, -5.04], %) // MoveAbsolute
|> lineTo([5.88, -4.2], %) // VerticalLineAbsolute
|> lineTo([6.72, -4.2], %) // HorizontalLineAbsolute
|> lineTo([6.72, -5.04], %) // VerticalLineAbsolute
|> lineTo([7.56, -5.04], %) // HorizontalLineAbsolute
|> lineTo([7.56, -5.88], %) // VerticalLineAbsolute
|> lineTo([5.04, -5.88], %) // HorizontalLineAbsolute
|> lineTo([5.04, -5.04], %) // VerticalLineAbsolute
|> lineTo([17.64, -5.88], %) // MoveAbsolute
|> lineTo([17.64, -5.04], %) // VerticalLineAbsolute
|> lineTo([16.8, -5.04], %) // HorizontalLineAbsolute
|> lineTo([16.8, -4.2], %) // VerticalLineAbsolute
|> lineTo([17.64, -4.2], %) // HorizontalLineRelative
|> lineTo([17.64, -5.04], %) // VerticalLineHorizonal
|> lineTo([18.48, -5.04], %) // HorizontalLineRelative
|> lineTo([18.48, -5.88], %) // VerticalLineHorizonal
|> lineTo([3.36, -5.04], %) // MoveAbsolute
|> lineTo([3.36, -5.88], %) // VerticalLineAbsolute
|> lineTo([2.52, -5.88], %) // HorizontalLineAbsolute
|> lineTo([2.52, -5.04], %) // VerticalLineAbsolute
|> lineTo([3.36, -5.04], %) // HorizontalLineAbsolute
|> lineTo([3.36, -4.2], %) // VerticalLineAbsolute
|> lineTo([4.2, -4.2], %) // HorizontalLineAbsolute
|> lineTo([4.2, -5.04], %) // VerticalLineHorizonal
|> lineTo([8.4, -4.2], %) // MoveRelative
|> lineTo([9.24, -4.2], %) // HorizontalLineRelative
|> lineTo([9.24, -5.04], %) // VerticalLineHorizonal
|> lineTo([10.08, -5.04], %) // HorizontalLineRelative
|> lineTo([10.08, -5.88], %) // VerticalLineAbsolute
|> lineTo([9.24, -5.88], %) // HorizontalLineAbsolute
|> lineTo([9.24, -5.04], %) // VerticalLineAbsolute
|> lineTo([8.4, -5.04], %) // HorizontalLineAbsolute
|> lineTo([11.76, -4.2], %) // MoveAbsolute
|> lineTo([12.6, -4.2], %) // HorizontalLineAbsolute
|> lineTo([12.6, -5.04], %) // VerticalLineAbsolute
|> lineTo([11.76, -5.04], %) // HorizontalLineAbsolute
|> lineTo([11.76, -5.88], %) // VerticalLineAbsolute
|> lineTo([10.92, -5.88], %) // HorizontalLineAbsolute
|> lineTo([10.92, -5.04], %) // VerticalLineAbsolute
|> lineTo([11.76, -5.04], %) // HorizontalLineRelative
|> lineTo([14.28, -10.92], %) // MoveRelative
|> lineTo([13.44, -10.92], %) // HorizontalLineRelative
|> lineTo([13.44, -13.44], %) // VerticalLineHorizonal
|> lineTo([14.28, -13.44], %) // HorizontalLineRelative
|> close(%)
show(svg)

View File

@@ -466,5 +466,5 @@ const svg = startSketchAt([0, 0])
|> bezierCurve({ control1: [-4, -3], control2: [-2.66, -3.67], to: [-3.32, -3.34] }, %) // CubicBezierAbsolute
|> bezierCurve({ control1: [0, -2], control2: [-2.68, -2.67], to: [-1.36, -2.34] }, %) // CubicBezierAbsolute
|> bezierCurve({ control1: [0, -0], control2: [0, -1.34], to: [0, -0.68] }, %) // CubicBezierAbsolute
- |> close(%);
- show(svg);
+ |> close(%)
+ show(svg)

View File

@@ -15,6 +15,7 @@ async fn execute_and_snapshot(code: &str) -> Result<image::DynamicImage> {
// For file conversions we need this to be long.
.timeout(std::time::Duration::from_secs(600))
.connect_timeout(std::time::Duration::from_secs(60))
+ .connection_verbose(true)
.tcp_keepalive(std::time::Duration::from_secs(600))
.http1_only();
@@ -31,16 +32,16 @@ async fn execute_and_snapshot(code: &str) -> Result<image::DynamicImage> {
// Create a temporary file to write the output to.
let output_file = std::env::temp_dir().join(format!("kcl_output_{}.png", uuid::Uuid::new_v4()));
- let tokens = kcl_lib::tokeniser::lexer(code);
+ let tokens = kcl_lib::token::lexer(code);
let parser = kcl_lib::parser::Parser::new(tokens);
let program = parser.ast()?;
let mut mem: kcl_lib::executor::ProgramMemory = Default::default();
- let mut engine = kcl_lib::engine::EngineConnection::new(ws).await?;
+ let engine = kcl_lib::engine::EngineConnection::new(ws).await?;
- let _ = kcl_lib::executor::execute(program, &mut mem, kcl_lib::executor::BodyType::Root, &mut engine)?;
+ let _ = kcl_lib::executor::execute(program, &mut mem, kcl_lib::executor::BodyType::Root, &engine).await?;
// Send a snapshot request to the engine.
let resp = engine
- .send_modeling_cmd_get_response(
+ .send_modeling_cmd(
uuid::Uuid::new_v4(),
kcl_lib::executor::SourceRange::default(),
kittycad::types::ModelingCmd::TakeSnapshot {
@@ -174,6 +175,14 @@ async fn serial_test_execute_pipes_on_pipes() {
twenty_twenty::assert_image("tests/executor/outputs/pipes_on_pipes.png", &result, 1.0);
}
+ #[tokio::test(flavor = "multi_thread")]
+ async fn serial_test_execute_kittycad_svg() {
+ let code = include_str!("inputs/kittycad_svg.kcl");
+ let result = execute_and_snapshot(code).await.unwrap();
+ twenty_twenty::assert_image("tests/executor/outputs/kittycad_svg.png", &result, 1.0);
+ }
#[tokio::test(flavor = "multi_thread")]
async fn test_member_expression_sketch_group() {
let code = r#"fn cube = (pos, scale) => {
@@ -201,3 +210,45 @@ show(b2)"#;
1.0,
);
}
#[tokio::test(flavor = "multi_thread")]
async fn test_close_arc() {
let code = r#"const center = [0,0]
const radius = 40
const height = 3
const body = startSketchAt([center[0]+radius, center[1]])
|> arc({angle_end: 360, angle_start: 0, radius: radius}, %)
|> close(%)
|> extrude(height, %)
show(body)"#;
let result = execute_and_snapshot(code).await.unwrap();
twenty_twenty::assert_image("tests/executor/outputs/close_arc.png", &result, 1.0);
}
#[tokio::test(flavor = "multi_thread")]
async fn test_negative_args() {
let code = r#"const width = 5
const height = 10
const length = 12
fn box = (sk1, sk2, scale) => {
const boxSketch = startSketchAt([sk1, sk2])
|> line([0, scale], %)
|> line([scale, 0], %)
|> line([0, -scale], %)
|> close(%)
|> extrude(scale, %)
return boxSketch
}
box(0, 0, 5)
box(10, 23, 8)
let thing = box(-12, -15, 10)
box(-20, -5, 10)"#;
let result = execute_and_snapshot(code).await.unwrap();
twenty_twenty::assert_image("tests/executor/outputs/negative_args.png", &result, 1.0);
}
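The new tests above all share one pattern: feed KCL source to execute_and_snapshot, then compare the rendered image against a checked-in PNG with a 1.0 tolerance. A hedged sketch of adding one more test in the same shape (the test name, KCL snippet, and output path are illustrative placeholders, not part of this change):

    // Hypothetical extra snapshot test following the pattern of test_close_arc / test_negative_args.
    #[tokio::test(flavor = "multi_thread")]
    async fn test_simple_box_snapshot() {
        // KCL modeled on the box helper in test_negative_args above.
        let code = r#"const b = startSketchAt([0, 0])
  |> line([0, 5], %)
  |> line([5, 0], %)
  |> line([0, -5], %)
  |> close(%)
  |> extrude(5, %)
show(b)"#;
        let result = execute_and_snapshot(code).await.unwrap();
        // Placeholder output path; the 1.0 tolerance mirrors the existing tests.
        twenty_twenty::assert_image("tests/executor/outputs/simple_box.png", &result, 1.0);
    }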

[Binary image files not shown: three new PNG snapshots added (95 KiB, 90 KiB, 78 KiB) and one existing snapshot updated (69 KiB before and after).]

View File

@@ -33,17 +33,13 @@ async fn setup(code: &str, name: &str) -> Result<(EngineConnection, Program, uui
.commands_ws(None, None, None, None, Some(false))
.await?;
- let tokens = kcl_lib::tokeniser::lexer(code);
+ let tokens = kcl_lib::token::lexer(code);
let parser = kcl_lib::parser::Parser::new(tokens);
let program = parser.ast()?;
let mut mem: kcl_lib::executor::ProgramMemory = Default::default();
- let mut engine = kcl_lib::engine::EngineConnection::new(ws).await?;
+ let engine = kcl_lib::engine::EngineConnection::new(ws).await?;
- let memory = kcl_lib::executor::execute(
- program.clone(),
- &mut mem,
- kcl_lib::executor::BodyType::Root,
- &mut engine,
- )?;
+ let memory =
+ kcl_lib::executor::execute(program.clone(), &mut mem, kcl_lib::executor::BodyType::Root, &engine).await?;
// We need to get the sketch ID.
// Get the sketch group ID from memory.
@@ -53,38 +49,44 @@ async fn setup(code: &str, name: &str) -> Result<(EngineConnection, Program, uui
let sketch_id = sketch_group.id;
let plane_id = uuid::Uuid::new_v4();
- engine.send_modeling_cmd(
- plane_id,
- SourceRange::default(),
- ModelingCmd::MakePlane {
- clobber: false,
- origin: Point3D { x: 0.0, y: 0.0, z: 0.0 },
- size: 60.0,
- x_axis: Point3D { x: 1.0, y: 0.0, z: 0.0 },
- y_axis: Point3D { x: 0.0, y: 1.0, z: 0.0 },
- },
- )?;
+ engine
+ .send_modeling_cmd(
+ plane_id,
+ SourceRange::default(),
+ ModelingCmd::MakePlane {
+ clobber: false,
+ origin: Point3D { x: 0.0, y: 0.0, z: 0.0 },
+ size: 60.0,
+ x_axis: Point3D { x: 1.0, y: 0.0, z: 0.0 },
+ y_axis: Point3D { x: 0.0, y: 1.0, z: 0.0 },
+ },
+ )
+ .await?;
// Enter sketch mode.
// We can't get control points without being in sketch mode.
// You can however get path info without sketch mode.
- engine.send_modeling_cmd(
- uuid::Uuid::new_v4(),
- SourceRange::default(),
- ModelingCmd::SketchModeEnable {
- animated: false,
- ortho: true,
- plane_id,
- },
- )?;
+ engine
+ .send_modeling_cmd(
+ uuid::Uuid::new_v4(),
+ SourceRange::default(),
+ ModelingCmd::SketchModeEnable {
+ animated: false,
+ ortho: true,
+ plane_id,
+ },
+ )
+ .await?;
// Enter edit mode.
// We can't get control points of an existing sketch without being in edit mode.
- engine.send_modeling_cmd(
- uuid::Uuid::new_v4(),
- SourceRange::default(),
- ModelingCmd::EditModeEnter { target: sketch_id },
- )?;
+ engine
+ .send_modeling_cmd(
+ uuid::Uuid::new_v4(),
+ SourceRange::default(),
+ ModelingCmd::EditModeEnter { target: sketch_id },
+ )
+ .await?;
Ok((engine, program, sketch_id))
}
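The substantive change in this hunk is that EngineConnection::send_modeling_cmd is now async, so every call in setup() gains a .await?. Pulled out into a hedged, self-contained sketch (paths and argument shapes as they appear in the hunks above; the anyhow error type is an assumption):

    // Hedged sketch of one awaited modeling command, shaped like the calls in setup().
    async fn enter_edit_mode(
        engine: &kcl_lib::engine::EngineConnection,
        sketch_id: uuid::Uuid,
    ) -> anyhow::Result<()> {
        engine
            .send_modeling_cmd(
                uuid::Uuid::new_v4(),
                kcl_lib::executor::SourceRange::default(),
                kittycad::types::ModelingCmd::EditModeEnter { target: sketch_id },
            )
            .await?;
        Ok(())
    }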

View File

@@ -1530,10 +1530,10 @@
resolved "https://registry.yarnpkg.com/@juggle/resize-observer/-/resize-observer-3.4.0.tgz#08d6c5e20cf7e4cc02fd181c4b0c225cd31dbb60"
integrity sha512-dfLbk+PwWvFzSxwk3n5ySL0hfBog779o8h68wK/7/APo/7cgyWp5jcXockbxdk5kFRkbeXWm4Fbi9FrdN381sA==
- "@kittycad/lib@^0.0.37":
- version "0.0.37"
- resolved "https://registry.yarnpkg.com/@kittycad/lib/-/lib-0.0.37.tgz#ec4f6c4fb5d06402a19339f3374036b6582d2265"
- integrity sha512-P8p9FeLV79/0Lfd0RioBta1drzhmpROnU4YV38+zsAA4LhibQCTjeekRkxVvHztGumPxz9pPsAeeLJyuz2RWKQ==
+ "@kittycad/lib@^0.0.39":
+ version "0.0.39"
+ resolved "https://registry.yarnpkg.com/@kittycad/lib/-/lib-0.0.39.tgz#e548acf5ff7d45a1f1ec9ad2c61ddcfc30d159b7"
+ integrity sha512-cB4wNjsKTMpJUn/kMK3qtkVAqB1csSglqThe+bj02nC1kWTB1XgYxksooc/Gzl1MoK1/n0OPQcbOb7Tojb836A==
dependencies:
node-fetch "3.3.2"
openapi-types "^12.0.0"
@@ -1883,10 +1883,10 @@
resolved "https://registry.yarnpkg.com/@types/crypto-js/-/crypto-js-4.1.1.tgz#602859584cecc91894eb23a4892f38cfa927890d"
integrity sha512-BG7fQKZ689HIoc5h+6D2Dgq1fABRa0RbBWKBd9SP/MVRVXROflpm5fhwyATX5duFmbStzyzyycPB8qUYKDH3NA==
- "@types/debounce@^1.2.1":
- version "1.2.1"
- resolved "https://registry.yarnpkg.com/@types/debounce/-/debounce-1.2.1.tgz#79b65710bc8b6d44094d286aecf38e44f9627852"
- integrity sha512-epMsEE85fi4lfmJUH/89/iV/LI+F5CvNIvmgs5g5jYFPfhO2S/ae8WSsLOKWdwtoaZw9Q2IhJ4tQ5tFCcS/4HA==
+ "@types/debounce-promise@^3.1.6":
+ version "3.1.6"
+ resolved "https://registry.yarnpkg.com/@types/debounce-promise/-/debounce-promise-3.1.6.tgz#873e838574011095ed0debf73eed3538e1261d75"
+ integrity sha512-DowqK95aku+OxMCeG2EQSeXeGeE8OCwLpMsUfIbP7hMF8Otj8eQXnzpwdtIKV+UqQBtkMcF6vbi4Otbh8P/wmg==
"@types/eslint@^8.4.5":
version "8.44.1"
@@ -2806,6 +2806,11 @@ data-uri-to-buffer@^4.0.0:
resolved "https://registry.yarnpkg.com/data-uri-to-buffer/-/data-uri-to-buffer-4.0.1.tgz#d8feb2b2881e6a4f58c2e08acfd0e2834e26222e"
integrity sha512-0R9ikRb668HB7QDxT1vkpuUBtqc53YyAwMwGeUFKRojY/NWKvdZ+9UYtRfGmhqNbRkTSVpMbmyhXipFFv2cb/A==
+ debounce-promise@^3.1.2:
+ version "3.1.2"
+ resolved "https://registry.yarnpkg.com/debounce-promise/-/debounce-promise-3.1.2.tgz#320fb8c7d15a344455cd33cee5ab63530b6dc7c5"
+ integrity sha512-rZHcgBkbYavBeD9ej6sP56XfG53d51CD4dnaw989YX/nZ/ZJfgRx/9ePKmTNiUiyQvh4mtrMoS3OAWW+yoYtpg==
debug@^3.2.7:
version "3.2.7"
resolved "https://registry.yarnpkg.com/debug/-/debug-3.2.7.tgz#72580b7e9145fb39b6676f9c5e5fb100b934179a"