Compare commits

..

36 Commits

Author SHA1 Message Date
12b3717eb5 Cut release v0.9.2 (#714) 2023-09-26 20:39:05 -04:00
0bc685b0c4 Bump tungstenite from 0.20.0 to 0.20.1 in /src/wasm-lib/kcl/fuzz (#709)
Bumps [tungstenite](https://github.com/snapview/tungstenite-rs) from 0.20.0 to 0.20.1.
- [Changelog](https://github.com/snapview/tungstenite-rs/blob/master/CHANGELOG.md)
- [Commits](https://github.com/snapview/tungstenite-rs/compare/v0.20.0...v0.20.1)

---
updated-dependencies:
- dependency-name: tungstenite
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-25 20:49:16 -07:00
9ee032771a unused dep (#710) 2023-09-26 03:22:05 +00:00
c307ddd1b1 resize (#706)
* start of resize

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* refactor

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* check if 0

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* updates

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* updates

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* will work w new lib

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* new types

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* handle resize effect

---------

Signed-off-by: Jess Frazelle <github@jessfraz.com>
Co-authored-by: Kurt Hutten Irev-Dev <k.hutten@protonmail.ch>
2023-09-25 19:49:53 -07:00
a30818ff2b fixes negative args in function (#707)
* updates

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* fixes

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* updates

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* fixes

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* fixes

Signed-off-by: Jess Frazelle <github@jessfraz.com>

---------

Signed-off-by: Jess Frazelle <github@jessfraz.com>
2023-09-25 15:25:58 -07:00
53e763d938 fix close arc (#704)
* fix close arc

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* much bigger radius

Signed-off-by: Jess Frazelle <github@jessfraz.com>

---------

Signed-off-by: Jess Frazelle <github@jessfraz.com>
2023-09-25 12:14:41 -07:00
8f74cd1d0c Bump tauri-plugin-fs-extra from 0190f68 to b04bde3 in /src-tauri (#702)
Bumps [tauri-plugin-fs-extra](https://github.com/tauri-apps/plugins-workspace) from `0190f68` to `b04bde3`.
- [Release notes](https://github.com/tauri-apps/plugins-workspace/releases)
- [Commits](0190f68f1d...b04bde3461)

---
updated-dependencies:
- dependency-name: tauri-plugin-fs-extra
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-25 09:33:53 -07:00
c271942897 remove errors (#703) 2023-09-25 07:28:03 +00:00
a03d09b41d Restructure tokenizer module (#700)
* Remove duplicated tests

These tests were already copied to tokeniser2.rs, so removing them doesn't affect code coverage.

* Move tokeniser to its own module

Now there's a module for tokens, and the tokenizer/lexer implementation is private within the token module.
2023-09-24 20:01:17 -05:00
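The restructuring in #700 keeps the lexer as an implementation detail behind a public token module. A minimal sketch of that kind of layout, with hypothetical names rather than the actual kcl module tree:

```rust
// Hypothetical layout illustrating the described restructuring: `token` is the
// public surface, while the tokeniser stays a private implementation detail.
mod token {
    // Private submodule: code outside `token` cannot reach it directly.
    mod tokeniser {
        use super::Token;

        pub(super) fn lex(input: &str) -> Vec<Token> {
            // Real lexing logic lives here; trivial placeholder for the sketch.
            input
                .split_whitespace()
                .map(|word| Token { text: word.to_string() })
                .collect()
        }
    }

    #[derive(Debug)]
    pub struct Token {
        pub text: String,
    }

    // The only entry point the rest of the crate sees.
    pub fn lexer(input: &str) -> Vec<Token> {
        tokeniser::lex(input)
    }
}

fn main() {
    let tokens = token::lexer("let x = 1");
    println!("{tokens:?}");
}
```

Callers only see `token::lexer` and the `Token` type, so the tokeniser submodule can be rewritten without touching the public surface.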
2971b7752b Bump rust websocket libraries (#701)
Changelog: https://github.com/snapview/tokio-tungstenite/blob/master/CHANGELOG.md#0201
2023-09-24 21:34:31 +00:00
70e99eb00b Refactor is_code_token into a method (#699)
* Refactor is_code_token into a method

* Fix typos, use Parser as it was imported
2023-09-24 21:11:36 +00:00
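The refactor in #699 turns a free `is_code_token` function into a method on the token type. A hedged sketch of that shape, with hypothetical token kinds (the real logic lives in the kcl crate):

```rust
// Sketch of the described refactor: a free function keyed on a Token becomes
// a method on Token itself, so call sites read `token.is_code()`.
enum TokenType {
    Word,
    Operator,
    Whitespace,
    LineComment,
    BlockComment,
}

struct Token {
    token_type: TokenType,
}

impl Token {
    // Before: `fn is_code_token(token: &Token) -> bool { ... }`
    // After: the predicate travels with the type it inspects.
    fn is_code(&self) -> bool {
        !matches!(
            self.token_type,
            TokenType::Whitespace | TokenType::LineComment | TokenType::BlockComment
        )
    }
}

fn main() {
    let t = Token { token_type: TokenType::Operator };
    assert!(t.is_code());
}
```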
5c66af59d2 New tokenizer based on winnow (#697)
* New tokenizer, using Winnow instead of regexes.

Between 1.3x and 4.4x speedup on lexer benchmarks :)

* Use dispatch instead of alt

Most of the time, if you know the first character of a token, you can narrow down its possible token types instead of trying each token type until one succeeds.

This further speeds up the lexer: this branch is now between 3x and 12x faster than main.
2023-09-22 21:57:39 -05:00
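The speedup described in #697 comes from dispatching on the first character before running any sub-parser. The sketch below illustrates that idea in plain Rust with hypothetical token kinds; it is not the actual winnow-based code from the PR:

```rust
// Dispatch-on-first-character illustration: one match narrows the set of
// possible token kinds before any sub-parser runs.
#[derive(Debug, PartialEq)]
enum Token<'a> {
    Number(&'a str),
    Word(&'a str),
    Symbol(char),
}

fn next_token(input: &str) -> Option<(Token<'_>, &str)> {
    let first = input.chars().next()?;
    match first {
        // Only one family of tokens can start with a digit.
        '0'..='9' => {
            let end = input.find(|c: char| !c.is_ascii_digit()).unwrap_or(input.len());
            Some((Token::Number(&input[..end]), &input[end..]))
        }
        // Identifiers and keywords start with a letter or underscore.
        'a'..='z' | 'A'..='Z' | '_' => {
            let end = input
                .find(|c: char| !c.is_ascii_alphanumeric() && c != '_')
                .unwrap_or(input.len());
            Some((Token::Word(&input[..end]), &input[end..]))
        }
        // Everything else is a single-character symbol in this sketch.
        _ => Some((Token::Symbol(first), &input[first.len_utf8()..])),
    }
}

fn main() {
    assert_eq!(next_token("42+x"), Some((Token::Number("42"), "+x")));
}
```

With `alt`-style parsing, every failed alternative does work before the right one is tried; a single match on the first character picks the right branch immediately.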
6dda6daeef Use separate benchmarks for lexing and parsing (#698) 2023-09-23 02:01:18 +00:00
b5387f1220 Cut release v0.9.1 (#693)
* bump to v0.9.1

* update bump instructions

* readme update

* read me again

* change pr convention
2023-09-22 10:38:17 +10:00
fd5921b366 Convert the lexer to be iterative not recursive (#691)
This is often more memory-efficient (does not create a bunch of stack
frames)
2023-09-21 19:19:08 -05:00
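A minimal sketch of the iterative shape described in #691, using a hypothetical whitespace lexer rather than the real KCL one: a single loop advances a cursor and pushes tokens, so memory use stays flat instead of growing by one stack frame per token.

```rust
// Iterative lexer skeleton: no recursion, just a cursor and a Vec.
fn lex_iterative(input: &str) -> Vec<String> {
    let mut tokens = Vec::new();
    let mut rest = input;
    while !rest.is_empty() {
        // Take one whitespace-delimited "token" per iteration; a real lexer
        // would branch on token kind here.
        let trimmed = rest.trim_start();
        if trimmed.is_empty() {
            break;
        }
        let end = trimmed.find(char::is_whitespace).unwrap_or(trimmed.len());
        tokens.push(trimmed[..end].to_string());
        rest = &trimmed[end..];
    }
    tokens
}

fn main() {
    assert_eq!(lex_iterative("a b  c"), vec!["a", "b", "c"]);
}
```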
716ad938fc stop gap for large files making editor slow (#690)
Signed-off-by: Jess Frazelle <github@jessfraz.com>
2023-09-21 16:13:22 -07:00
40136eb392 Bump kittycad from 0.2.25 to 0.2.26 in /src-tauri (#680)
Bumps [kittycad](https://github.com/KittyCAD/kittycad.rs) from 0.2.25 to 0.2.26.
- [Release notes](https://github.com/KittyCAD/kittycad.rs/releases)
- [Commits](https://github.com/KittyCAD/kittycad.rs/compare/v0.2.25...v0.2.26)

---
updated-dependencies:
- dependency-name: kittycad
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-21 14:05:55 -07:00
8d2b89fcd1 Bump openapitor from 0d121f6 to 61a1605 in /src/wasm-lib (#679)
Bumps [openapitor](https://github.com/KittyCAD/kittycad.rs) from `0d121f6` to `61a1605`.
- [Release notes](https://github.com/KittyCAD/kittycad.rs/releases)
- [Commits](0d121f6881...61a16059b3)

---
updated-dependencies:
- dependency-name: openapitor
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-21 15:22:04 -05:00
ad9fba3390 Bump tauri-plugin-fs-extra from 76832e6 to 0190f68 in /src-tauri (#681)
Bumps [tauri-plugin-fs-extra](https://github.com/tauri-apps/plugins-workspace) from `76832e6` to `0190f68`.
- [Release notes](https://github.com/tauri-apps/plugins-workspace/releases)
- [Commits](76832e60bf...0190f68f1d)

---
updated-dependencies:
- dependency-name: tauri-plugin-fs-extra
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-21 15:21:08 -05:00
911c43af50 Bump phonenumber from 0.3.2+8.13.9 to 0.3.3+8.13.9 in /src-tauri (#687)
Bumps [phonenumber](https://github.com/whisperfish/rust-phonenumber) from 0.3.2+8.13.9 to 0.3.3+8.13.9.
- [Commits](https://github.com/whisperfish/rust-phonenumber/commits)

---
updated-dependencies:
- dependency-name: phonenumber
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-21 15:02:37 -05:00
ab4e04f6c2 Bump phonenumber from 0.3.2+8.13.9 to 0.3.3+8.13.9 in /src/wasm-lib (#685)
Bumps [phonenumber](https://github.com/whisperfish/rust-phonenumber) from 0.3.2+8.13.9 to 0.3.3+8.13.9.
- [Commits](https://github.com/whisperfish/rust-phonenumber/commits)

---
updated-dependencies:
- dependency-name: phonenumber
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-21 15:02:23 -05:00
94aef05f74 Bump phonenumber from 0.3.2+8.13.9 to 0.3.3+8.13.9 in /src/wasm-lib/kcl/fuzz (#686)
Bump phonenumber in /src/wasm-lib/kcl/fuzz

Bumps [phonenumber](https://github.com/whisperfish/rust-phonenumber) from 0.3.2+8.13.9 to 0.3.3+8.13.9.
- [Commits](https://github.com/whisperfish/rust-phonenumber/commits)

---
updated-dependencies:
- dependency-name: phonenumber
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-21 15:02:10 -05:00
d820cf2446 Tokenizer is accidentally quadratic (#689)
* Add comments and rename a function

* Typo: paran -> paren

* Use bytes, not string, for the tokenizer

* Fix typo
2023-09-21 14:18:42 -05:00
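The quadratic behaviour fixed in #689 is the classic cost of random-accessing characters in a UTF-8 string. A small illustration, with a made-up digit-counting task standing in for the tokenizer:

```rust
// `chars().nth(i)` re-walks the string from the start on every call, so a
// lexer that indexes characters this way scans roughly n^2/2 bytes in total.
// Indexing a byte slice is O(1) per access.
fn count_digits_quadratic(src: &str) -> usize {
    let mut count = 0;
    for i in 0..src.chars().count() {
        // Each `nth(i)` is O(i): the whole loop is quadratic in the input size.
        if src.chars().nth(i).map_or(false, |c| c.is_ascii_digit()) {
            count += 1;
        }
    }
    count
}

fn count_digits_linear(src: &str) -> usize {
    // `as_bytes()` gives O(1) random access, and a single pass stays O(n).
    src.as_bytes().iter().filter(|b| b.is_ascii_digit()).count()
}

fn main() {
    let src = "x1 = 2 + 345";
    assert_eq!(count_digits_quadratic(src), count_digits_linear(src));
}
```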
0c724c4971 Start to restructure the Engine's connection to the backend (#674)
* Start to restructure the Engine's connection to the backend

1) Add in a tearDown stub for when the Engine is torn down. This is now
   distinct from a 'close', which will not stop connect from trying
   again. Running tearDown will mark the connection to not be retried
   and close active connections.

2) Move the retry logic out of connect and into the constructor. It will
   attempt to reconnect at the same rate as we had previously.

3) The timeout will now only close the connection, not restart it.

Signed-off-by: Paul Tagliamonte <paul@kittycad.io>

* Don't continue on dead conn & setTimeout on init only

* Clean up extra setTimeout

* Keep track of connection timeouts and clear on close

* Fix tsc by defining Timeout

Signed-off-by: Paul Tagliamonte <paul@kittycad.io>

* appease the format gods

---------

Signed-off-by: Paul Tagliamonte <paul@kittycad.io>
Co-authored-by: Adam Sunderland <adam@kittycad.io>
2023-09-21 12:07:47 -04:00
b54ac4a694 improve getNodePathFromSourceRange and therefore the ast explorer as well (#683)
2023-09-21 05:40:41 +00:00
27227092b1 app stuck on blur when engine errors (#682)
* temp fix for when engine returns error

* don't add extrude to show function
2023-09-21 04:32:47 +00:00
04e1b92a5b Add a benchmark for parsing pipes-on-pipes (#678) 2023-09-21 03:13:07 +00:00
0553cd4621 tests for big files (#675)
* shit;

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* updates

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* cleanup

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* updates

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* updates

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* updates

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* still ignore the big one

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* Add big kitt SVG to benchmarks

---------

Signed-off-by: Jess Frazelle <github@jessfraz.com>
Co-authored-by: Adam Chalmers <adam.s.chalmers@gmail.com>
2023-09-20 19:35:37 -07:00
61a0c88af4 Add IDE dirs to .gitignore (#676) 2023-09-21 02:03:09 +00:00
d5b0544437 Bump tauri-plugin-fs-extra from 5b814f5 to 76832e6 in /src-tauri (#657)
Bumps [tauri-plugin-fs-extra](https://github.com/tauri-apps/plugins-workspace) from `5b814f5` to `76832e6`.
- [Release notes](https://github.com/tauri-apps/plugins-workspace/releases)
- [Commits](5b814f56e6...76832e60bf)

---
updated-dependencies:
- dependency-name: tauri-plugin-fs-extra
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-20 18:47:38 -07:00
6cc8af5c23 make stdlib functions async (#672)
* wip

Signed-off-by: Jess Frazelle <github@jessfraz.com>

updates

Signed-off-by: Jess Frazelle <github@jessfraz.com>

updates

Signed-off-by: Jess Frazelle <github@jessfraz.com>

fixes

Signed-off-by: Jess Frazelle <github@jessfraz.com>

closer

Signed-off-by: Jess Frazelle <github@jessfraz.com>

fixes

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* fixes

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* closer

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* closer

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* compiles

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* connection

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* fixes

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* fix wasm

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* timeout

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* remove the drop

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* drop handle

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* updates

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* fixes

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* fix

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* fixes

Signed-off-by: Jess Frazelle <github@jessfraz.com>

---------

Signed-off-by: Jess Frazelle <github@jessfraz.com>
2023-09-20 18:27:08 -07:00
888104080e bump v0.9.0 (#673) 2023-09-21 10:38:40 +10:00
b6769889e3 Handle relative paths at kcl level (#506)
* handle relative paths at kcl level

* fmt

* update kittycad

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* updates

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* bump

Signed-off-by: Jess Frazelle <github@jessfraz.com>

* fix tests

Signed-off-by: Jess Frazelle <github@jessfraz.com>

---------

Signed-off-by: Jess Frazelle <github@jessfraz.com>
Co-authored-by: Jess Frazelle <github@jessfraz.com>
Co-authored-by: Jess Frazelle <jessfraz@users.noreply.github.com>
2023-09-21 10:36:26 +10:00
a32258dac4 Engine manager can be cloned (#671) 2023-09-20 16:22:47 -07:00
18dbbad244 Use an actor to manage the Tokio engine connection (#669)
* Use an actor to manage the Tokio engine connection

This means the EngineManager trait's methods take &self rather than &mut self, and the Tokio implementation can be cloned.

* Clean up code
2023-09-20 16:59:03 -05:00
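The actor pattern described in #669 gives each caller a cheap, cloneable handle whose methods only need `&self`, while one spawned task owns the mutable connection state. A hedged sketch under those assumptions (hypothetical command and handle names, not the real EngineManager trait):

```rust
// Actor sketch: the spawned task owns the connection state; the handle only
// holds an mpsc sender, so it is Clone and its methods take &self.
// Assumes tokio with the "rt-multi-thread" and "macros" features enabled.
use tokio::sync::{mpsc, oneshot};

enum Command {
    Send { payload: String, reply: oneshot::Sender<String> },
}

#[derive(Clone)]
struct EngineHandle {
    tx: mpsc::Sender<Command>,
}

impl EngineHandle {
    // `&self` is enough: all mutation happens inside the actor task.
    async fn send(&self, payload: String) -> Option<String> {
        let (reply_tx, reply_rx) = oneshot::channel();
        self.tx
            .send(Command::Send { payload, reply: reply_tx })
            .await
            .ok()?;
        reply_rx.await.ok()
    }
}

fn spawn_engine_actor() -> EngineHandle {
    let (tx, mut rx) = mpsc::channel(32);
    tokio::spawn(async move {
        // The actor exclusively owns whatever state the connection needs.
        let mut sent_count: u64 = 0;
        while let Some(cmd) = rx.recv().await {
            match cmd {
                Command::Send { payload, reply } => {
                    sent_count += 1;
                    let _ = reply.send(format!("ack #{sent_count} for {payload}"));
                }
            }
        }
    });
    EngineHandle { tx }
}

#[tokio::main]
async fn main() {
    let handle = spawn_engine_actor();
    let clone = handle.clone(); // handles are freely cloneable
    if let Some(ack) = clone.send("ping".to_string()).await {
        println!("{ack}");
    }
}
```

Because the handle only wraps an `mpsc::Sender`, cloning it is trivial, which is what the follow-up in #671 (making the engine manager cloneable) relies on.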
b67c16cc9d Benchmark for KCL parser (#664)
* KCL benchmarks

* CI for benchmarks

* More specific name for benchmark

* Benchmark the right directory

* Format
2023-09-20 13:15:28 -05:00
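The benchmarks added in #664 use criterion, which the new cargo-criterion workflow below runs in CI. A hypothetical bench file of that shape (`parse` and the sample program are stand-ins, not the real KCL parser):

```rust
// Criterion benchmark skeleton; the real benches live under src/wasm-lib/kcl.
use std::hint::black_box;

use criterion::{criterion_group, criterion_main, Criterion};

fn parse(src: &str) -> usize {
    // Placeholder for the real KCL parser.
    src.split_whitespace().count()
}

fn bench_parser(c: &mut Criterion) {
    let program = "const part = startSketchAt([0, 0]) |> line([1, 0], %)";
    c.bench_function("parse_small_program", |b| {
        b.iter(|| parse(black_box(program)))
    });
}

criterion_group!(benches, bench_parser);
criterion_main!(benches);
```

A bench like this is registered in Cargo.toml as a `[[bench]]` target with `harness = false` and then picked up by the `cargo criterion` step in the workflow below.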
77 changed files with 3785 additions and 1825 deletions

View File

@ -54,4 +54,4 @@ jobs:
- name: Run clippy
run: |
cd "${{ matrix.dir }}"
cargo clippy --all --tests -- -D warnings
cargo clippy --all --tests --benches -- -D warnings

.github/workflows/cargo-criterion.yml vendored Normal file (37 changed lines)
View File

@ -0,0 +1,37 @@
on:
push:
branches:
- main
paths:
- '**.rs'
- '**/Cargo.toml'
- '**/Cargo.lock'
- '**/rust-toolchain.toml'
- .github/workflows/cargo-criterion.yml
pull_request:
paths:
- '**.rs'
- '**/Cargo.toml'
- '**/Cargo.lock'
- '**/rust-toolchain.toml'
- .github/workflows/cargo-criterion.yml
workflow_dispatch:
permissions: read-all
name: cargo criterion
jobs:
cargocriterion:
name: cargo criterion
runs-on: ubuntu-latest-8-cores
steps:
- uses: actions/checkout@v4
- uses: dtolnay/rust-toolchain@stable
- name: Install dependencies
run: |
cargo install cargo-criterion
- name: Rust Cache
uses: Swatinem/rust-cache@v2.6.1
- name: Benchmark kcl library
shell: bash
run: |-
cd src/wasm-lib/kcl; cargo criterion

View File

@ -58,4 +58,5 @@ jobs:
cargo nextest run --workspace --no-fail-fast -P ci
env:
KITTYCAD_API_TOKEN: ${{secrets.KITTYCAD_API_TOKEN}}
RUST_MIN_STACK: 10485760000

.gitignore vendored (5 changed lines)
View File

@ -22,6 +22,11 @@ npm-debug.log*
yarn-debug.log*
yarn-error.log*
.idea
.vscode
src/wasm-lib/.idea
src/wasm-lib/.vscode
# rust
src/wasm-lib/target
src/wasm-lib/bindings

View File

@ -123,13 +123,24 @@ Before you submit a contribution PR to this repo, please ensure that:
## Release a new version
1. Bump the versions in the .json files by creating a `Bump to v{x}.{y}.{z}` PR, committing the changes from
1. Bump the versions in the .json files by creating a `Cut release v{x}.{y}.{z}` PR, committing the changes from
```bash
VERSION=x.y.z yarn run bump-jsons
```
The PR may serve as a place to discuss the human-readable changelog and extra QA.
The PR may serve as a place to discuss the human-readable changelog and extra QA. A quick way of listing the PRs merged since the last bump is to [use this PR filter](https://github.com/KittyCAD/modeling-app/pulls?q=is%3Apr+sort%3Aupdated-desc+is%3Amerged+), open the browser console, and paste in the following:
```typescript
console.log(
'- ' +
Array.from(
document.querySelectorAll('[data-hovercard-type="pull_request"]')
).map((a) => `[${a.innerText}](${a.href})`).join(`
- `)
)
```
then grab the Markdown list and delete any entries that are older than the last bump
2. Merge the PR

View File

@ -1,6 +1,6 @@
{
"name": "untitled-app",
"version": "0.8.2",
"version": "0.9.2",
"private": true,
"dependencies": {
"@codemirror/autocomplete": "^6.9.0",
@ -10,7 +10,7 @@
"@fortawesome/react-fontawesome": "^0.2.0",
"@headlessui/react": "^1.7.13",
"@headlessui/tailwindcss": "^0.2.0",
"@kittycad/lib": "^0.0.37",
"@kittycad/lib": "^0.0.39",
"@lezer/javascript": "^1.4.7",
"@open-rpc/client-js": "^1.8.1",
"@react-hook/resize-observer": "^1.2.6",
@ -27,6 +27,7 @@
"@uiw/react-codemirror": "^4.21.13",
"@xstate/react": "^3.2.2",
"crypto-js": "^4.1.1",
"debounce-promise": "^3.1.2",
"formik": "^2.4.3",
"fuse.js": "^6.6.2",
"http-server": "^14.1.1",
@ -57,7 +58,7 @@
"zustand": "^4.1.4"
},
"scripts": {
"start": "BROWSER=none vite",
"start": "vite",
"build": "curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y && source \"$HOME/.cargo/env\" && curl https://rustwasm.github.io/wasm-pack/installer/init.sh -sSf | sh -s -- -y && yarn build:wasm && vite build",
"build:local": "vite build",
"build:both": "vite build",
@ -65,7 +66,7 @@
"pretest": "yarn remove-importmeta",
"test": "vitest --mode development",
"test:nowatch": "vitest run --mode development",
"test:rust": "(cd src/wasm-lib && cargo test --all && cargo clippy --all --tests)",
"test:rust": "(cd src/wasm-lib && cargo test --all && cargo clippy --all --tests --benches)",
"test:cov": "vitest run --coverage --mode development",
"simpleserver:ci": "yarn pretest && http-server ./public --cors -p 3000 &",
"simpleserver": "yarn pretest && http-server ./public --cors -p 3000",
@ -101,7 +102,7 @@
"@babel/preset-env": "^7.22.9",
"@tauri-apps/cli": "^1.3.1",
"@types/crypto-js": "^4.1.1",
"@types/debounce": "^1.2.1",
"@types/debounce-promise": "^3.1.6",
"@types/isomorphic-fetch": "^0.0.36",
"@types/react-modal": "^3.16.0",
"@types/uuid": "^9.0.1",

src-tauri/Cargo.lock generated (78 changed lines)
View File

@ -155,6 +155,20 @@ version = "0.21.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "604178f6c5c21f02dc555784810edfb88d34ac2c73b2eae109655649ee73ce3d"
[[package]]
name = "bigdecimal"
version = "0.4.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "454bca3db10617b88b566f205ed190aedb0e0e6dd4cad61d3988a72e8c5594cb"
dependencies = [
"autocfg",
"libm",
"num-bigint",
"num-integer",
"num-traits",
"serde",
]
[[package]]
name = "bincode"
version = "1.3.3"
@ -1630,13 +1644,14 @@ dependencies = [
[[package]]
name = "kittycad"
version = "0.2.25"
version = "0.2.26"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d9cf962b1e81a0b4eb923a727e761b40672cbacc7f5f0b75e13579d346352bc7"
checksum = "e2623ee601ce203476229df3f9d3a14664cb43e3f7455e9ac8ed91aacaa6163d"
dependencies = [
"anyhow",
"async-trait",
"base64 0.21.2",
"bigdecimal",
"bytes",
"chrono",
"data-encoding",
@ -1688,6 +1703,12 @@ version = "0.2.148"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9cdc71e17332e86d2e1d38c1f99edcb6288ee11b815fb1a4b049eaa2114d369b"
[[package]]
name = "libm"
version = "0.2.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f7012b1bbb0719e1097c47611d3898568c546d597c2e74d66f6087edd5233ff4"
[[package]]
name = "line-wrap"
version = "0.1.1"
@ -1959,6 +1980,17 @@ dependencies = [
"winapi",
]
[[package]]
name = "num-bigint"
version = "0.4.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "608e7659b5c3d7cba262d894801b9ec9d00de989e8a82bd4bef91d08da45cdc0"
dependencies = [
"autocfg",
"num-integer",
"num-traits",
]
[[package]]
name = "num-integer"
version = "0.1.45"
@ -1982,9 +2014,9 @@ dependencies = [
[[package]]
name = "num-traits"
version = "0.2.15"
version = "0.2.16"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "578ede34cf02f8924ab9447f50c28075b4d3e5b269972345e7e0372b38c6cdcd"
checksum = "f30b0abd723be7e2ffca1272140fac1a2f084c77ec3e123c192b66af1ee9e6c2"
dependencies = [
"autocfg",
]
@ -2410,9 +2442,9 @@ dependencies = [
[[package]]
name = "phonenumber"
version = "0.3.2+8.13.9"
version = "0.3.3+8.13.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "34749f64ea9d76f10cdc8a859588b57775f59177c7dd91f744d620bd62982d6f"
checksum = "635f3e6288e4f01c049d89332a031bd74f25d64b6fb94703ca966e819488cd06"
dependencies = [
"bincode",
"either",
@ -2425,6 +2457,7 @@ dependencies = [
"regex-cache",
"serde",
"serde_derive",
"strum",
"thiserror",
]
@ -3021,10 +3054,11 @@ dependencies = [
[[package]]
name = "schemars"
version = "0.8.13"
version = "0.8.15"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "763f8cd0d4c71ed8389c90cb8100cba87e763bd01a8e614d4f0af97bcd50a161"
checksum = "1f7b0ce13155372a76ee2e1c5ffba1fe61ede73fbea5630d61eee6fac4929c0c"
dependencies = [
"bigdecimal",
"bytes",
"chrono",
"dyn-clone",
@ -3037,9 +3071,9 @@ dependencies = [
[[package]]
name = "schemars_derive"
version = "0.8.13"
version = "0.8.15"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ec0f696e21e10fa546b7ffb1c9672c6de8fbc7a81acf59524386d8639bf12737"
checksum = "e85e2a16b12bdb763244c69ab79363d71db2b4b918a2def53f80b02e0574b13c"
dependencies = [
"proc-macro2",
"quote",
@ -3458,6 +3492,28 @@ dependencies = [
"syn 2.0.33",
]
[[package]]
name = "strum"
version = "0.24.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "063e6045c0e62079840579a7e47a355ae92f60eb74daaf156fb1e84ba164e63f"
dependencies = [
"strum_macros",
]
[[package]]
name = "strum_macros"
version = "0.24.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1e385be0d24f186b4ce2f9982191e7101bb737312ad61c1f2f984f34bcf85d59"
dependencies = [
"heck 0.4.1",
"proc-macro2",
"quote",
"rustversion",
"syn 1.0.109",
]
[[package]]
name = "syn"
version = "1.0.109"
@ -3719,7 +3775,7 @@ dependencies = [
[[package]]
name = "tauri-plugin-fs-extra"
version = "0.0.0"
source = "git+https://github.com/tauri-apps/plugins-workspace?branch=v1#5b814f56e6368fdec46c4ddb04a07e0923ff995a"
source = "git+https://github.com/tauri-apps/plugins-workspace?branch=v1#b04bde3461066c709d6801cf9ca305cf889a8394"
dependencies = [
"log",
"serde",

View File

@ -16,7 +16,7 @@ tauri-build = { version = "1.4.0", features = [] }
[dependencies]
anyhow = "1"
kittycad = "0.2.25"
kittycad = "0.2.26"
oauth2 = "4.4.2"
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"

View File

@ -8,7 +8,7 @@
},
"package": {
"productName": "kittycad-modeling",
"version": "0.8.2"
"version": "0.9.2"
},
"tauri": {
"allowlist": {

View File

@ -31,6 +31,7 @@ import { TextEditor } from 'components/TextEditor'
import { Themes, getSystemTheme } from 'lib/theme'
import { useSetupEngineManager } from 'hooks/useSetupEngineManager'
import { useEngineConnectionSubscriptions } from 'hooks/useEngineConnectionSubscriptions'
import { engineCommandManager } from './lang/std/engineConnection'
export function App() {
const { code: loadedCode, project } = useLoaderData() as IndexLoaderData
@ -39,7 +40,6 @@ export function App() {
useHotKeyListener()
const {
setCode,
engineCommandManager,
buttonDownInStream,
openPanes,
setOpenPanes,
@ -52,7 +52,6 @@ export function App() {
guiMode: s.guiMode,
setGuiMode: s.setGuiMode,
setCode: s.setCode,
engineCommandManager: s.engineCommandManager,
buttonDownInStream: s.buttonDownInStream,
openPanes: s.openPanes,
setOpenPanes: s.setOpenPanes,
@ -91,12 +90,12 @@ export function App() {
if (guiMode.sketchMode === 'sketchEdit') {
// TODO: share this with Toolbar's "Exit sketch" button
// exiting sketch should be done consistently across all exits
engineCommandManager?.sendSceneCommand({
engineCommandManager.sendSceneCommand({
type: 'modeling_cmd_req',
cmd_id: uuidv4(),
cmd: { type: 'edit_mode_exit' },
})
engineCommandManager?.sendSceneCommand({
engineCommandManager.sendSceneCommand({
type: 'modeling_cmd_req',
cmd_id: uuidv4(),
cmd: { type: 'default_camera_disable_sketch_mode' },
@ -107,7 +106,7 @@ export function App() {
// when exiting sketch mode in the future
executeAst()
} else {
engineCommandManager?.sendSceneCommand({
engineCommandManager.sendSceneCommand({
type: 'modeling_cmd_req',
cmd_id: uuidv4(),
cmd: {
@ -156,7 +155,7 @@ export function App() {
useEngineConnectionSubscriptions()
const debounceSocketSend = throttle<EngineCommand>((message) => {
engineCommandManager?.sendSceneCommand(message)
engineCommandManager.sendSceneCommand(message)
}, 16)
const handleMouseMove: MouseEventHandler<HTMLDivElement> = (e) => {
e.nativeEvent.preventDefault()
@ -216,7 +215,6 @@ export function App() {
} else if (interactionGuards.zoom.dragCallback(eWithButton)) {
interaction = 'zoom'
} else {
console.log('none')
return
}

View File

@ -18,6 +18,7 @@ import styles from './Toolbar.module.css'
import { v4 as uuidv4 } from 'uuid'
import { useAppMode } from 'hooks/useAppMode'
import { ActionIcon } from 'components/ActionIcon'
import { engineCommandManager } from './lang/std/engineConnection'
export const sketchButtonClassnames = {
background:
@ -50,7 +51,6 @@ export const Toolbar = () => {
ast,
updateAst,
programMemory,
engineCommandManager,
executeAst,
} = useStore((s) => ({
guiMode: s.guiMode,
@ -59,15 +59,10 @@ export const Toolbar = () => {
ast: s.ast,
updateAst: s.updateAst,
programMemory: s.programMemory,
engineCommandManager: s.engineCommandManager,
executeAst: s.executeAst,
}))
useAppMode()
useEffect(() => {
console.log('guiMode', guiMode)
}, [guiMode])
function ToolbarButtons({ className }: React.HTMLAttributes<HTMLElement>) {
return (
<span className={styles.toolbarButtons + ' ' + className}>
@ -173,12 +168,12 @@ export const Toolbar = () => {
{guiMode.mode === 'sketch' && (
<button
onClick={() => {
engineCommandManager?.sendSceneCommand({
engineCommandManager.sendSceneCommand({
type: 'modeling_cmd_req',
cmd_id: uuidv4(),
cmd: { type: 'edit_mode_exit' },
})
engineCommandManager?.sendSceneCommand({
engineCommandManager.sendSceneCommand({
type: 'modeling_cmd_req',
cmd_id: uuidv4(),
cmd: { type: 'default_camera_disable_sketch_mode' },
@ -214,7 +209,7 @@ export const Toolbar = () => {
<button
key={sketchFnName}
onClick={() => {
engineCommandManager?.sendSceneCommand({
engineCommandManager.sendSceneCommand({
type: 'modeling_cmd_req',
cmd_id: uuidv4(),
cmd: {

View File

@ -10,6 +10,7 @@ import {
} from '../lang/modifyAst'
import { findAllPreviousVariables, PrevVariable } from '../lang/queryAst'
import { useStore } from '../useStore'
import { engineCommandManager } from '../lang/std/engineConnection'
export const AvailableVars = ({
onVarClick,
@ -92,14 +93,11 @@ export function useCalc({
newVariableInsertIndex: number
setNewVariableName: (a: string) => void
} {
const { ast, programMemory, selectionRange, engineCommandManager } = useStore(
(s) => ({
ast: s.ast,
programMemory: s.programMemory,
selectionRange: s.selectionRanges.codeBasedSelections[0].range,
engineCommandManager: s.engineCommandManager,
})
)
const { ast, programMemory, selectionRange } = useStore((s) => ({
ast: s.ast,
programMemory: s.programMemory,
selectionRange: s.selectionRanges.codeBasedSelections[0].range,
}))
const inputRef = useRef<HTMLInputElement>(null)
const [availableVarInfo, setAvailableVarInfo] = useState<
ReturnType<typeof findAllPreviousVariables>
@ -140,7 +138,6 @@ export function useCalc({
}, [ast, programMemory, selectionRange])
useEffect(() => {
if (!engineCommandManager) return
try {
const code = `const __result__ = ${value}\nshow(__result__)`
const ast = parser_wasm(code)

View File

@ -1,5 +1,4 @@
import { CollapsiblePanel, CollapsiblePanelProps } from './CollapsiblePanel'
import { useStore } from '../useStore'
import { v4 as uuidv4 } from 'uuid'
import { EngineCommand } from '../lang/std/engineConnection'
import { useState } from 'react'
@ -7,6 +6,7 @@ import { ActionButton } from '../components/ActionButton'
import { faCheck } from '@fortawesome/free-solid-svg-icons'
import { isReducedMotion } from 'lang/util'
import { AstExplorer } from './AstExplorer'
import { engineCommandManager } from '../lang/std/engineConnection'
type SketchModeCmd = Extract<
Extract<EngineCommand, { type: 'modeling_cmd_req' }>['cmd'],
@ -14,9 +14,6 @@ type SketchModeCmd = Extract<
>
export const DebugPanel = ({ className, ...props }: CollapsiblePanelProps) => {
const { engineCommandManager } = useStore((s) => ({
engineCommandManager: s.engineCommandManager,
}))
const [sketchModeCmd, setSketchModeCmd] = useState<SketchModeCmd>({
type: 'default_camera_enable_sketch_mode',
origin: { x: 0, y: 0, z: 0 },
@ -70,19 +67,18 @@ export const DebugPanel = ({ className, ...props }: CollapsiblePanelProps) => {
className="w-16"
type="checkbox"
checked={sketchModeCmd.ortho}
onChange={(a) => {
console.log(a, (a as any).checked)
onChange={(a) =>
setSketchModeCmd({
...sketchModeCmd,
ortho: a.target.checked,
})
}}
}
/>
</div>
<ActionButton
Element="button"
onClick={() => {
engineCommandManager?.sendSceneCommand({
engineCommandManager.sendSceneCommand({
type: 'modeling_cmd_req',
cmd: sketchModeCmd,
cmd_id: uuidv4(),

View File

@ -1,11 +1,11 @@
import { v4 as uuidv4 } from 'uuid'
import { useStore } from '../useStore'
import { faFileExport, faXmark } from '@fortawesome/free-solid-svg-icons'
import { ActionButton } from './ActionButton'
import Modal from 'react-modal'
import React from 'react'
import { useFormik } from 'formik'
import { Models } from '@kittycad/lib'
import { engineCommandManager } from '../lang/std/engineConnection'
type OutputFormat = Models['OutputFormat_type']
@ -18,10 +18,6 @@ interface ExportButtonProps extends React.PropsWithChildren {
}
export const ExportButton = ({ children, className }: ExportButtonProps) => {
const { engineCommandManager } = useStore((s) => ({
engineCommandManager: s.engineCommandManager,
}))
const [modalIsOpen, setIsOpen] = React.useState(false)
const defaultType = 'gltf'
@ -66,7 +62,7 @@ export const ExportButton = ({ children, className }: ExportButtonProps) => {
},
}
}
engineCommandManager?.sendSceneCommand({
engineCommandManager.sendSceneCommand({
type: 'modeling_cmd_req',
cmd: {
type: 'export',

View File

@ -50,7 +50,7 @@ export const MemoryPanel = ({
export const processMemory = (programMemory: ProgramMemory) => {
const processedMemory: any = {}
Object.keys(programMemory.root).forEach((key) => {
Object.keys(programMemory?.root || {}).forEach((key) => {
const val = programMemory.root[key]
if (typeof val.value !== 'function') {
if (val.type === 'SketchGroup') {

View File

@ -25,6 +25,7 @@ import { modify_ast_for_sketch } from '../wasm-lib/pkg/wasm_lib'
import { KCLError } from 'lang/errors'
import { KclError as RustKclError } from '../wasm-lib/kcl/bindings/KclError'
import { rangeTypeFix } from 'lang/abstractSyntaxTree'
import { engineCommandManager } from '../lang/std/engineConnection'
export const Stream = ({ className = '' }) => {
const [isLoading, setIsLoading] = useState(true)
@ -32,7 +33,6 @@ export const Stream = ({ className = '' }) => {
const videoRef = useRef<HTMLVideoElement>(null)
const {
mediaStream,
engineCommandManager,
setButtonDownInStream,
didDragInStream,
setDidDragInStream,
@ -45,7 +45,6 @@ export const Stream = ({ className = '' }) => {
programMemory,
} = useStore((s) => ({
mediaStream: s.mediaStream,
engineCommandManager: s.engineCommandManager,
setButtonDownInStream: s.setButtonDownInStream,
fileId: s.fileId,
didDragInStream: s.didDragInStream,
@ -73,7 +72,7 @@ export const Stream = ({ className = '' }) => {
if (!videoRef.current) return
if (!mediaStream) return
videoRef.current.srcObject = mediaStream
}, [mediaStream, engineCommandManager])
}, [mediaStream])
const handleMouseDown: MouseEventHandler<HTMLVideoElement> = (e) => {
if (!videoRef.current) return
@ -107,7 +106,7 @@ export const Stream = ({ className = '' }) => {
}
if (guiMode.mode === 'sketch' && guiMode.sketchMode === ('move' as any)) {
engineCommandManager?.sendSceneCommand({
engineCommandManager.sendSceneCommand({
type: 'modeling_cmd_req',
cmd: {
type: 'handle_mouse_drag_start',
@ -121,7 +120,7 @@ export const Stream = ({ className = '' }) => {
guiMode.sketchMode === ('sketch_line' as any)
)
) {
engineCommandManager?.sendSceneCommand({
engineCommandManager.sendSceneCommand({
type: 'modeling_cmd_req',
cmd: {
type: 'camera_drag_start',
@ -139,7 +138,7 @@ export const Stream = ({ className = '' }) => {
const handleScroll: WheelEventHandler<HTMLVideoElement> = (e) => {
if (!cameraMouseDragGuards[cameraControls].zoom.scrollCallback(e)) return
engineCommandManager?.sendSceneCommand({
engineCommandManager.sendSceneCommand({
type: 'modeling_cmd_req',
cmd: {
type: 'default_camera_zoom',
@ -177,7 +176,7 @@ export const Stream = ({ className = '' }) => {
}
if (!didDragInStream) {
engineCommandManager?.sendSceneCommand({
engineCommandManager.sendSceneCommand({
type: 'modeling_cmd_req',
cmd: {
type: 'select_with_point',
@ -214,7 +213,7 @@ export const Stream = ({ className = '' }) => {
window: { x, y },
}
}
engineCommandManager?.sendSceneCommand(command).then(async (resp) => {
engineCommandManager.sendSceneCommand(command).then(async (resp) => {
if (!(guiMode.mode === 'sketch')) return
if (guiMode.sketchMode === 'selectFace') return
@ -240,9 +239,6 @@ export const Stream = ({ className = '' }) => {
) {
// Let's get the updated ast.
if (sketchGroupId === '') return
console.log('guiMode.pathId', guiMode.pathId)
// We have a problem if we do not have an id for the sketch group.
if (
guiMode.pathId === undefined ||
@ -285,7 +281,7 @@ export const Stream = ({ className = '' }) => {
guiMode.waitingFirstClick &&
!isEditingExistingSketch
) {
const curve = await engineCommandManager?.sendSceneCommand({
const curve = await engineCommandManager.sendSceneCommand({
type: 'modeling_cmd_req',
cmd_id: uuidv4(),
cmd: {
@ -326,7 +322,7 @@ export const Stream = ({ className = '' }) => {
resp?.data?.data?.entities_modified?.length &&
(!guiMode.waitingFirstClick || isEditingExistingSketch)
) {
const curve = await engineCommandManager?.sendSceneCommand({
const curve = await engineCommandManager.sendSceneCommand({
type: 'modeling_cmd_req',
cmd_id: uuidv4(),
cmd: {
@ -371,12 +367,12 @@ export const Stream = ({ className = '' }) => {
setGuiMode({
mode: 'default',
})
engineCommandManager?.sendSceneCommand({
engineCommandManager.sendSceneCommand({
type: 'modeling_cmd_req',
cmd_id: uuidv4(),
cmd: { type: 'edit_mode_exit' },
})
engineCommandManager?.sendSceneCommand({
engineCommandManager.sendSceneCommand({
type: 'modeling_cmd_req',
cmd_id: uuidv4(),
cmd: { type: 'default_camera_disable_sketch_mode' },

View File

@ -30,6 +30,7 @@ import { isOverlap, roundOff } from 'lib/utils'
import { kclErrToDiagnostic } from 'lang/errors'
import { CSSRuleObject } from 'tailwindcss/types/config'
import interact from '@replit/codemirror-interact'
import { engineCommandManager } from '../lang/std/engineConnection'
export const editorShortcutMeta = {
formatCode: {
@ -52,7 +53,6 @@ export const TextEditor = ({
code,
deferredSetCode,
editorView,
engineCommandManager,
formatCode,
isLSPServerReady,
selectionRanges,
@ -64,7 +64,6 @@ export const TextEditor = ({
code: s.code,
deferredSetCode: s.deferredSetCode,
editorView: s.editorView,
engineCommandManager: s.engineCommandManager,
formatCode: s.formatCode,
isLSPServerReady: s.isLSPServerReady,
selectionRanges: s.selectionRanges,
@ -173,7 +172,7 @@ export const TextEditor = ({
const idBasedSelections = codeBasedSelections
.map(({ type, range }) => {
const hasOverlap = Object.entries(
engineCommandManager?.sourceRangeMap || {}
engineCommandManager.sourceRangeMap || {}
).filter(([_, sourceRange]) => {
return isOverlap(sourceRange, range)
})
@ -186,7 +185,7 @@ export const TextEditor = ({
})
.filter(Boolean) as any
engineCommandManager?.cusorsSelected({
engineCommandManager.cusorsSelected({
otherSelections: [],
idBasedSelections,
})

View File

@ -133,7 +133,7 @@ export const SetAbsDistance = ({ buttonType }: { buttonType: ButtonType }) => {
callBack: updateCursors(setCursor, selectionRanges, pathToNodeMap),
})
} catch (e) {
console.log('e', e)
console.log('error', e)
}
}}
disabled={!enableAngLen}

View File

@ -147,7 +147,7 @@ export const SetAngleLength = ({
callBack: updateCursors(setCursor, selectionRanges, pathToNodeMap),
})
} catch (e) {
console.log('e', e)
console.log('error', e)
}
}}
disabled={!enableAngLen}

View File

@ -109,7 +109,6 @@ export default class Client extends jsrpc.JSONRPCServerAndClient {
}
}
messageString += message
// console.log(messageString)
return
})

View File

@ -13,6 +13,7 @@ import {
CompletionItemKind,
CompletionTriggerKind,
} from 'vscode-languageserver-protocol'
import debounce from 'debounce-promise'
import type {
Completion,
@ -53,14 +54,11 @@ export class LanguageServerPlugin implements PluginValue {
private languageId: string
private documentVersion: number
private changesTimeout: number
constructor(private view: EditorView, private allowHTMLContent: boolean) {
this.client = this.view.state.facet(client)
this.documentUri = this.view.state.facet(documentUri)
this.languageId = this.view.state.facet(languageId)
this.documentVersion = 0
this.changesTimeout = 0
this.client.attachPlugin(this)
@ -71,12 +69,10 @@ export class LanguageServerPlugin implements PluginValue {
update({ docChanged }: ViewUpdate) {
if (!docChanged) return
if (this.changesTimeout) clearTimeout(this.changesTimeout)
this.changesTimeout = window.setTimeout(() => {
this.sendChange({
documentText: this.view.state.doc.toString(),
})
}, changesDelay)
this.sendChange({
documentText: this.view.state.doc.toString(),
})
}
destroy() {
@ -99,14 +95,32 @@ export class LanguageServerPlugin implements PluginValue {
async sendChange({ documentText }: { documentText: string }) {
if (!this.client.ready) return
if (documentText.length > 5000) {
// Clear out the text it thinks we have, large documents will throw a stack error.
// This is obviously not a good fix but it works for now til we figure
// out the stack limits in wasm and also rewrite the parser.
// Since this is only for hover and completions it will be fine,
// completions will still work for stdlib but hover will not.
// That seems like a fine trade-off for a working editor for the time
// being.
documentText = ''
}
try {
await this.client.textDocumentDidChange({
textDocument: {
uri: this.documentUri,
version: this.documentVersion++,
debounce(
() => {
return this.client.textDocumentDidChange({
textDocument: {
uri: this.documentUri,
version: this.documentVersion++,
},
contentChanges: [{ text: documentText }],
})
},
contentChanges: [{ text: documentText }],
})
changesDelay,
{ leading: true }
)
} catch (e) {
console.error(e)
}

View File

@ -8,6 +8,7 @@ import { ArtifactMap, EngineCommandManager } from 'lang/std/engineConnection'
import { Models } from '@kittycad/lib/dist/types/src'
import { isReducedMotion } from 'lang/util'
import { isOverlap } from 'lib/utils'
import { engineCommandManager } from '../lang/std/engineConnection'
interface DefaultPlanes {
xy: string
@ -17,19 +18,13 @@ interface DefaultPlanes {
}
export function useAppMode() {
const {
guiMode,
setGuiMode,
selectionRanges,
engineCommandManager,
selectionRangeTypeMap,
} = useStore((s) => ({
guiMode: s.guiMode,
setGuiMode: s.setGuiMode,
selectionRanges: s.selectionRanges,
engineCommandManager: s.engineCommandManager,
selectionRangeTypeMap: s.selectionRangeTypeMap,
}))
const { guiMode, setGuiMode, selectionRanges, selectionRangeTypeMap } =
useStore((s) => ({
guiMode: s.guiMode,
setGuiMode: s.setGuiMode,
selectionRanges: s.selectionRanges,
selectionRangeTypeMap: s.selectionRangeTypeMap,
}))
const [defaultPlanes, setDefaultPlanes] = useState<DefaultPlanes | null>(null)
useEffect(() => {
if (
@ -65,7 +60,7 @@ export function useAppMode() {
setDefaultPlanesHidden(engineCommandManager, localDefaultPlanes, true)
// TODO figure out the plane to use based on the sketch
// maybe it's easier to make a new plane than rely on the defaults
await engineCommandManager?.sendSceneCommand({
await engineCommandManager.sendSceneCommand({
type: 'modeling_cmd_req',
cmd_id: uuidv4(),
cmd: {
@ -135,7 +130,7 @@ export function useAppMode() {
])
useEffect(() => {
const unSub = engineCommandManager?.subscribeTo({
const unSub = engineCommandManager.subscribeTo({
event: 'select_with_point',
callback: async ({ data }) => {
if (!data.entity_id) return
@ -144,18 +139,16 @@ export function useAppMode() {
// user clicked something else in the scene
return
}
const sketchModeResponse = await engineCommandManager?.sendSceneCommand(
{
type: 'modeling_cmd_req',
cmd_id: uuidv4(),
cmd: {
type: 'sketch_mode_enable',
plane_id: data.entity_id,
ortho: true,
animated: !isReducedMotion(),
},
}
)
const sketchModeResponse = await engineCommandManager.sendSceneCommand({
type: 'modeling_cmd_req',
cmd_id: uuidv4(),
cmd: {
type: 'sketch_mode_enable',
plane_id: data.entity_id,
ortho: true,
animated: !isReducedMotion(),
},
})
setDefaultPlanesHidden(engineCommandManager, defaultPlanes, true)
const sketchUuid = uuidv4()
const proms: any[] = []
@ -178,8 +171,7 @@ export function useAppMode() {
},
})
)
const res = await Promise.all(proms)
console.log('res', res)
await Promise.all(proms)
setGuiMode({
mode: 'sketch',
sketchMode: 'sketchEdit',
@ -209,7 +201,7 @@ async function createPlane(
}
) {
const planeId = uuidv4()
await engineCommandManager?.sendSceneCommand({
await engineCommandManager.sendSceneCommand({
type: 'modeling_cmd_req',
cmd: {
type: 'make_plane',
@ -221,7 +213,7 @@ async function createPlane(
},
cmd_id: planeId,
})
await engineCommandManager?.sendSceneCommand({
await engineCommandManager.sendSceneCommand({
type: 'modeling_cmd_req',
cmd: {
type: 'plane_set_color',
@ -234,12 +226,12 @@ async function createPlane(
}
function setDefaultPlanesHidden(
engineCommandManager: EngineCommandManager | undefined,
engineCommandManager: EngineCommandManager,
defaultPlanes: DefaultPlanes,
hidden: boolean
) {
Object.values(defaultPlanes).forEach((planeId) => {
engineCommandManager?.sendSceneCommand({
engineCommandManager.sendSceneCommand({
type: 'modeling_cmd_req',
cmd_id: uuidv4(),
cmd: {

View File

@ -1,14 +1,9 @@
import { useEffect } from 'react'
import { useStore } from 'useStore'
import { engineCommandManager } from '../lang/std/engineConnection'
export function useEngineConnectionSubscriptions() {
const {
engineCommandManager,
setCursor2,
setHighlightRange,
highlightRange,
} = useStore((s) => ({
engineCommandManager: s.engineCommandManager,
const { setCursor2, setHighlightRange, highlightRange } = useStore((s) => ({
setCursor2: s.setCursor2,
setHighlightRange: s.setHighlightRange,
highlightRange: s.highlightRange,

View File

@ -1,53 +1,90 @@
import { useLayoutEffect } from 'react'
import { useLayoutEffect, useEffect, useRef } from 'react'
import { _executor } from '../lang/executor'
import { useStore } from '../useStore'
import { EngineCommandManager } from '../lang/std/engineConnection'
import { engineCommandManager } from '../lang/std/engineConnection'
import { deferExecution } from 'lib/utils'
export function useSetupEngineManager(
streamRef: React.RefObject<HTMLDivElement>,
token?: string
) {
const {
setEngineCommandManager,
setMediaStream,
setIsStreamReady,
setStreamDimensions,
executeCode,
streamDimensions,
} = useStore((s) => ({
setEngineCommandManager: s.setEngineCommandManager,
setMediaStream: s.setMediaStream,
setIsStreamReady: s.setIsStreamReady,
setStreamDimensions: s.setStreamDimensions,
executeCode: s.executeCode,
streamDimensions: s.streamDimensions,
}))
const streamWidth = streamRef?.current?.offsetWidth
const streamHeight = streamRef?.current?.offsetHeight
const hasSetNonZeroDimensions = useRef<boolean>(false)
useLayoutEffect(() => {
// Load the engine command manager once with the initial width and height,
// then we do not want to reload it.
const { width: quadWidth, height: quadHeight } = getDimensions(
streamWidth,
streamHeight
)
if (!hasSetNonZeroDimensions.current && quadHeight && quadWidth) {
engineCommandManager.start({
setMediaStream,
setIsStreamReady,
width: quadWidth,
height: quadHeight,
token,
})
engineCommandManager.waitForReady.then(() => {
executeCode()
})
setStreamDimensions({
streamWidth: quadWidth,
streamHeight: quadHeight,
})
hasSetNonZeroDimensions.current = true
}
}, [streamRef?.current?.offsetWidth, streamRef?.current?.offsetHeight])
useEffect(() => {
const handleResize = deferExecution(() => {
const { width, height } = getDimensions(
streamRef?.current?.offsetWidth,
streamRef?.current?.offsetHeight
)
if (
streamDimensions.streamWidth !== width ||
streamDimensions.streamHeight !== height
) {
engineCommandManager.handleResize({
streamWidth: width,
streamHeight: height,
})
setStreamDimensions({
streamWidth: width,
streamHeight: height,
})
}
}, 500)
window.addEventListener('resize', handleResize)
return () => {
window.removeEventListener('resize', handleResize)
}
}, [])
}
function getDimensions(streamWidth?: number, streamHeight?: number) {
const width = streamWidth ? streamWidth : 0
const quadWidth = Math.round(width / 4) * 4
const height = streamHeight ? streamHeight : 0
const quadHeight = Math.round(height / 4) * 4
useLayoutEffect(() => {
setStreamDimensions({
streamWidth: quadWidth,
streamHeight: quadHeight,
})
if (!width || !height) return
const eng = new EngineCommandManager({
setMediaStream,
setIsStreamReady,
width: quadWidth,
height: quadHeight,
token,
})
setEngineCommandManager(eng)
eng.waitForReady.then(() => {
executeCode()
})
return () => {
eng?.tearDown()
}
}, [quadWidth, quadHeight])
return { width: quadWidth, height: quadHeight }
}

View File

@ -48,7 +48,7 @@ export function useConvertToVariable() {
updateAst(_modifiedAst, true)
} catch (e) {
console.log('e', e)
console.log('error', e)
}
}

View File

@ -1691,7 +1691,6 @@ describe('parsing errors', () => {
let _theError
try {
const result = expect(parser_wasm(code))
console.log('result', result)
} catch (e) {
_theError = e
}

View File

@ -1,6 +1,7 @@
import { getNodePathFromSourceRange, getNodeFromPath } from './queryAst'
import { parser_wasm } from './abstractSyntaxTree'
import { initPromise } from './rust'
import { Identifier } from './abstractSyntaxTreeTypes'
beforeAll(() => initPromise)
@ -27,4 +28,81 @@ const sk3 = startSketchAt([0, 0])
expect([node.start, node.end]).toEqual(sourceRange)
expect(node.type).toBe('CallExpression')
})
it('gets path right for function definition params', () => {
const code = `fn cube = (pos, scale) => {
const sg = startSketchAt(pos)
|> line([0, scale], %)
|> line([scale, 0], %)
|> line([0, -scale], %)
return sg
}
const b1 = cube([0,0], 10)`
const subStr = 'pos, scale'
const subStrIndex = code.indexOf(subStr)
const sourceRange: [number, number] = [
subStrIndex,
subStrIndex + 'pos'.length,
]
const ast = parser_wasm(code)
const nodePath = getNodePathFromSourceRange(ast, sourceRange)
const node = getNodeFromPath<Identifier>(ast, nodePath).node
expect(nodePath).toEqual([
['body', ''],
[0, 'index'],
['declarations', 'VariableDeclaration'],
[0, 'index'],
['init', ''],
['params', 'FunctionExpression'],
[0, 'index'],
])
expect(node.type).toBe('Identifier')
expect(node.name).toBe('pos')
})
it('gets path right for deep within function definition body', () => {
const code = `fn cube = (pos, scale) => {
const sg = startSketchAt(pos)
|> line([0, scale], %)
|> line([scale, 0], %)
|> line([0, -scale], %)
return sg
}
const b1 = cube([0,0], 10)`
const subStr = 'scale, 0'
const subStrIndex = code.indexOf(subStr)
const sourceRange: [number, number] = [
subStrIndex,
subStrIndex + 'scale'.length,
]
const ast = parser_wasm(code)
const nodePath = getNodePathFromSourceRange(ast, sourceRange)
const node = getNodeFromPath<Identifier>(ast, nodePath).node
expect(nodePath).toEqual([
['body', ''],
[0, 'index'],
['declarations', 'VariableDeclaration'],
[0, 'index'],
['init', ''],
['body', 'FunctionExpression'],
['body', 'FunctionExpression'],
[0, 'index'],
['declarations', 'VariableDeclaration'],
[0, 'index'],
['init', ''],
['body', 'PipeExpression'],
[2, 'index'],
['arguments', 'CallExpression'],
[0, 'index'],
['elements', 'ArrayExpression'],
[0, 'index'],
])
expect(node.type).toBe('Identifier')
expect(node.name).toBe('scale')
})
})

View File

@ -321,7 +321,7 @@ export function extrudeSketch(
[0, 'index'],
]
return {
modifiedAst: addToShow(_node, name),
modifiedAst: node,
pathToNode: [...pathToNode.slice(0, -1), [showCallIndex, 'index']],
pathToExtrudeArg,
}

View File

@ -239,7 +239,29 @@ function moreNodePathFromSourceRange(
}
return path
}
console.error('not implemented')
if (_node.type === 'FunctionExpression' && isInRange) {
for (let i = 0; i < _node.params.length; i++) {
const param = _node.params[i]
if (param.start <= start && param.end >= end) {
path.push(['params', 'FunctionExpression'])
path.push([i, 'index'])
return moreNodePathFromSourceRange(param, sourceRange, path)
}
}
if (_node.body.start <= start && _node.body.end >= end) {
path.push(['body', 'FunctionExpression'])
const fnBody = _node.body.body
for (let i = 0; i < fnBody.length; i++) {
const statement = fnBody[i]
if (statement.start <= start && statement.end >= end) {
path.push(['body', 'FunctionExpression'])
path.push([i, 'index'])
return moreNodePathFromSourceRange(statement, sourceRange, path)
}
}
}
}
console.error('not implemented: ' + node.type)
return path
}

View File

@ -7,7 +7,7 @@ export const recast = (ast: Program): string => {
return s
} catch (e) {
// TODO: do something real with the error.
console.log('recast', e)
console.log('recast error', e)
throw e
}
}

View File

@ -23,6 +23,10 @@ interface ResultCommand extends CommandInfo {
data: any
raw: WebSocketResponse
}
interface FailedCommand extends CommandInfo {
type: 'failed'
errors: Models['FailureWebSocketResponse_type']['errors']
}
interface PendingCommand extends CommandInfo {
type: 'pending'
promise: Promise<any>
@ -30,7 +34,7 @@ interface PendingCommand extends CommandInfo {
}
export interface ArtifactMap {
[key: string]: ResultCommand | PendingCommand
[key: string]: ResultCommand | PendingCommand | FailedCommand
}
export interface SourceRangeMap {
[key: string]: SourceRange
@ -41,6 +45,11 @@ interface NewTrackArgs {
mediaStream: MediaStream
}
// This looks funny, I know. This is needed because node and the browser
// disagree as to the type. In a browser it's a number, but in node it's a
// "Timeout".
type Timeout = ReturnType<typeof setTimeout>
type ClientMetrics = Models['ClientMetrics_type']
// EngineConnection encapsulates the connection(s) to the Engine
@ -52,6 +61,9 @@ export class EngineConnection {
unreliableDataChannel?: RTCDataChannel
private ready: boolean
private connecting: boolean
private dead: boolean
private failedConnTimeout: Timeout | null
readonly url: string
private readonly token?: string
@ -87,6 +99,9 @@ export class EngineConnection {
this.url = url
this.token = token
this.ready = false
this.connecting = false
this.dead = false
this.failedConnTimeout = null
this.onWebsocketOpen = onWebsocketOpen
this.onDataChannelOpen = onDataChannelOpen
this.onEngineConnectionOpen = onEngineConnectionOpen
@ -97,7 +112,10 @@ export class EngineConnection {
// TODO(paultag): This ought to be tweakable.
const pingIntervalMs = 10000
setInterval(() => {
let pingInterval = setInterval(() => {
if (this.dead) {
clearInterval(pingInterval)
}
if (this.isReady()) {
// When we're online, every 10 seconds, we'll attempt to put a 'ping'
// command through the WebSocket connection. This will help both ends
@ -106,6 +124,24 @@ export class EngineConnection {
this.send({ type: 'ping' })
}
}, pingIntervalMs)
const connectionTimeoutMs = VITE_KC_CONNECTION_TIMEOUT_MS
let connectInterval = setInterval(() => {
if (this.dead) {
clearInterval(connectInterval)
return
}
if (this.isReady()) {
return
}
console.log('connecting via retry')
this.connect()
}, connectionTimeoutMs)
}
// isConnecting will return true when connect has been called, but the full
// WebRTC is not online.
isConnecting() {
return this.connecting
}
// isReady will return true only when the WebRTC *and* WebSocket connection
// are connected. During setup, the WebSocket connection comes online first,
@ -114,6 +150,10 @@ export class EngineConnection {
isReady() {
return this.ready
}
tearDown() {
this.dead = true
this.close()
}
// shouldTrace will return true when Sentry should be used to instrument
// the Engine.
shouldTrace() {
@ -125,8 +165,10 @@ export class EngineConnection {
// This will attempt the full handshake, and retry if the connection
// did not establish.
connect() {
// TODO(paultag): make this safe to call multiple times, and figure out
// when a connection is in progress (state: connecting or something).
console.log('connect was called')
if (this.isConnecting() || this.isReady()) {
return
}
// Information on the connect transaction
@ -358,20 +400,6 @@ export class EngineConnection {
})
})
}
// TODO(paultag): This ought to be both controllable, as well as something
// like exponential backoff to have some grace on the backend, as well as
// fix responsiveness for clients that had a weird network hiccup.
const connectionTimeoutMs = VITE_KC_CONNECTION_TIMEOUT_MS
setTimeout(() => {
if (this.isReady()) {
return
}
console.log('engine connection timeout on connection, retrying')
this.close()
this.connect()
}, connectionTimeoutMs)
})
this.pc.addEventListener('track', (event) => {
@ -461,6 +489,7 @@ export class EngineConnection {
this.onEngineConnectionOpen(this)
this.ready = true
this.connecting = false
})
this.unreliableDataChannel.addEventListener('close', (event) => {
@ -474,6 +503,22 @@ export class EngineConnection {
})
})
const connectionTimeoutMs = VITE_KC_CONNECTION_TIMEOUT_MS
if (this.failedConnTimeout) {
console.log('clearing timeout before set')
clearTimeout(this.failedConnTimeout)
this.failedConnTimeout = null
}
console.log('timeout set')
this.failedConnTimeout = setTimeout(() => {
if (this.isReady()) {
return
}
console.log('engine connection timeout on connection, closing')
this.close()
}, connectionTimeoutMs)
this.onConnectionStarted(this)
}
unreliableSend(message: object | string) {
@ -498,9 +543,15 @@ export class EngineConnection {
this.pc = undefined
this.unreliableDataChannel = undefined
this.webrtcStatsCollector = undefined
if (this.failedConnTimeout) {
console.log('closed timeout in close')
clearTimeout(this.failedConnTimeout)
this.failedConnTimeout = null
}
this.onClose(this)
this.ready = false
this.connecting = false
}
}
@ -544,7 +595,12 @@ export class EngineCommandManager {
[localUnsubscribeId: string]: (a: any) => void
}
} = {} as any
constructor({
constructor() {
this.engineConnection = undefined
}
start({
setMediaStream,
setIsStreamReady,
width,
@ -557,6 +613,16 @@ export class EngineCommandManager {
height: number
token?: string
}) {
if (width === 0 || height === 0) {
return
}
// If we already have an engine connection, just need to resize the stream.
if (this.engineConnection) {
this.handleResize({ streamWidth: width, streamHeight: height })
return
}
this.waitForReady = new Promise((resolve) => {
this.resolveReady = resolve
})
@ -617,6 +683,8 @@ export class EngineCommandManager {
message.request_id
) {
this.handleModelingCommand(message.resp, message.request_id)
} else if (!message.success && message.request_id) {
this.handleFailedModelingCommand(message)
}
}
})
@ -636,7 +704,35 @@ export class EngineCommandManager {
this.engineConnection?.connect()
}
handleResize({
streamWidth,
streamHeight,
}: {
streamWidth: number
streamHeight: number
}) {
console.log('handleResize', streamWidth, streamHeight)
if (!this.engineConnection?.isReady()) {
return
}
const resizeCmd: EngineCommand = {
type: 'modeling_cmd_req',
cmd_id: uuidv4(),
cmd: {
type: 'reconfigure_stream',
width: streamWidth,
height: streamHeight,
fps: 60,
},
}
this.engineConnection?.send(resizeCmd)
}
handleModelingCommand(message: WebSocketResponse, id: string) {
if (this.engineConnection === undefined) {
return
}
if (message.type !== 'modeling') {
return
}
@ -673,8 +769,40 @@ export class EngineCommandManager {
}
}
}
handleFailedModelingCommand({
request_id,
errors,
}: Models['FailureWebSocketResponse_type']) {
const id = request_id
if (!id) return
const command = this.artifactMap[id]
if (command && command.type === 'pending') {
const resolve = command.resolve
this.artifactMap[id] = {
type: 'failed',
range: command.range,
commandType: command.commandType,
parentId: command.parentId ? command.parentId : undefined,
errors,
}
resolve({
id,
commandType: command.commandType,
range: command.range,
errors,
})
} else {
this.artifactMap[id] = {
type: 'failed',
range: command.range,
commandType: command.commandType,
parentId: command.parentId ? command.parentId : undefined,
errors,
}
}
}
tearDown() {
this.engineConnection?.close()
this.engineConnection?.tearDown()
}
startNewSession() {
this.artifactMap = {}
@ -769,6 +897,9 @@ export class EngineCommandManager {
})
}
sendSceneCommand(command: EngineCommand): Promise<any> {
if (this.engineConnection === undefined) {
return Promise.resolve()
}
if (
command.type === 'modeling_cmd_req' &&
command.cmd.type !== lastMessage
@ -777,7 +908,6 @@ export class EngineCommandManager {
lastMessage = command.cmd.type
}
if (!this.engineConnection?.isReady()) {
console.log('socket not ready')
return Promise.resolve()
}
if (command.type !== 'modeling_cmd_req') return Promise.resolve()
@ -821,10 +951,12 @@ export class EngineCommandManager {
range: SourceRange
command: EngineCommand | string
}): Promise<any> {
if (this.engineConnection === undefined) {
return Promise.resolve()
}
this.sourceRangeMap[id] = range
if (!this.engineConnection?.isReady()) {
console.log('socket not ready')
return Promise.resolve()
}
this.engineConnection?.send(command)
@ -867,6 +999,9 @@ export class EngineCommandManager {
rangeStr: string,
commandStr: string
): Promise<any> {
if (this.engineConnection === undefined) {
return Promise.resolve()
}
if (id === undefined) {
throw new Error('id is undefined')
}
@ -890,6 +1025,8 @@ export class EngineCommandManager {
}
if (command.type === 'result') {
return command.data
} else if (command.type === 'failed') {
return Promise.resolve(command.errors)
}
return command.promise
}
@ -915,6 +1052,9 @@ export class EngineCommandManager {
}
}
private async fixIdMappings(ast: Program, programMemory: ProgramMemory) {
if (this.engineConnection === undefined) {
return
}
/* This is a temporary solution since the cmd_ids that are sent through when
sending 'extend_path' ids are not used as the segment ids.
@ -994,3 +1134,5 @@ export class EngineCommandManager {
})
}
}
export const engineCommandManager = new EngineCommandManager()

View File

@ -1279,7 +1279,7 @@ export function getTransformInfos(
}) as TransformInfo[]
return theTransforms
} catch (error) {
console.log(error)
console.log('error', error)
return []
}
}

View File

@ -11,7 +11,7 @@ export async function asyncLexer(str: string): Promise<Token[]> {
return tokens
} catch (e) {
// TODO: do something real with the error.
console.log('lexer', e)
console.log('lexer error', e)
throw e
}
}
@ -22,7 +22,7 @@ export function lexer(str: string): Token[] {
return tokens
} catch (e) {
// TODO: do something real with the error.
console.log('lexer', e)
console.log('lexer error', e)
throw e
}
}

View File

@ -39,6 +39,6 @@ export async function exportSave(data: ArrayBuffer) {
}
} catch (e) {
// TODO: do something real with the error.
console.log('export', e)
console.log('export error', e)
}
}

View File

@ -36,7 +36,7 @@ export async function initializeProjectDirectory(directory: string) {
try {
docDirectory = await documentDir()
} catch (e) {
console.log(e)
console.log('error', e)
docDirectory = await homeDir() // seems to work better on Linux
}

View File

@ -5,6 +5,9 @@ import {
EngineCommand,
} from '../lang/std/engineConnection'
import { SourceRange } from 'lang/executor'
import { Models } from '@kittycad/lib'
type WebSocketResponse = Models['OkWebSocketResponseData_type']
class MockEngineCommandManager {
constructor(mockParams: {
@ -23,7 +26,13 @@ class MockEngineCommandManager {
range: SourceRange
command: EngineCommand
}): Promise<any> {
return Promise.resolve()
const response: WebSocketResponse = {
type: 'modeling',
data: {
modeling_response: { type: 'empty' },
},
}
return Promise.resolve(JSON.stringify(response))
}
sendModelingCommandFromWasm(
id: string,
@ -66,11 +75,12 @@ export async function executor(
ast: Program,
pm: ProgramMemory = { root: {}, return: null }
): Promise<ProgramMemory> {
const engineCommandManager = new EngineCommandManager({
const engineCommandManager = new EngineCommandManager()
engineCommandManager.start({
setIsStreamReady: () => {},
setMediaStream: () => {},
width: 100,
height: 100,
width: 0,
height: 0,
})
await engineCommandManager.waitForReady
engineCommandManager.startNewSession()

View File

@ -19,6 +19,7 @@ import { KCLError } from './lang/errors'
import { deferExecution } from 'lib/utils'
import { _executor } from './lang/executor'
import { bracket } from 'lib/exampleKcl'
import { engineCommandManager } from './lang/std/engineConnection'
export type Selection = {
type: 'default' | 'line-end' | 'line-mid'
@ -162,8 +163,6 @@ export interface StoreState {
setProgramMemory: (programMemory: ProgramMemory) => void
isShiftDown: boolean
setIsShiftDown: (isShiftDown: boolean) => void
engineCommandManager?: EngineCommandManager
setEngineCommandManager: (engineCommandManager: EngineCommandManager) => void
mediaStream?: MediaStream
setMediaStream: (mediaStream: MediaStream) => void
isStreamReady: boolean
@ -226,7 +225,7 @@ export const useStore = create<StoreState>()(
const result = await executeCode({
code: code || get().code,
lastAst: get().ast,
engineCommandManager: get().engineCommandManager,
engineCommandManager: engineCommandManager,
})
if (!result.isChange) {
return
@ -332,8 +331,6 @@ export const useStore = create<StoreState>()(
executeAst: async (ast) => {
const _ast = ast || get().ast
if (!get().isStreamReady) return
const engineCommandManager = get().engineCommandManager!
if (!engineCommandManager) return
set({ isExecuting: true })
const { logs, errors, programMemory } = await executeAst({
@ -350,8 +347,6 @@ export const useStore = create<StoreState>()(
executeAstMock: async (ast) => {
const _ast = ast || get().ast
if (!get().isStreamReady) return
const engineCommandManager = get().engineCommandManager!
if (!engineCommandManager) return
const { logs, errors, programMemory } = await executeAst({
ast: _ast,
@ -435,8 +430,6 @@ export const useStore = create<StoreState>()(
setProgramMemory: (programMemory) => set({ programMemory }),
isShiftDown: false,
setIsShiftDown: (isShiftDown) => set({ isShiftDown }),
setEngineCommandManager: (engineCommandManager) =>
set({ engineCommandManager }),
setMediaStream: (mediaStream) => set({ mediaStream }),
isStreamReady: false,
setIsStreamReady: (isStreamReady) => set({ isStreamReady }),
@ -454,7 +447,9 @@ export const useStore = create<StoreState>()(
fileId: '',
setFileId: (fileId) => set({ fileId }),
streamDimensions: { streamWidth: 1280, streamHeight: 720 },
setStreamDimensions: (streamDimensions) => set({ streamDimensions }),
setStreamDimensions: (streamDimensions) => {
set({ streamDimensions })
},
isExecuting: false,
setIsExecuting: (isExecuting) => set({ isExecuting }),
@ -519,7 +514,7 @@ async function executeCode({
}: {
code: string
lastAst: Program
engineCommandManager?: EngineCommandManager
engineCommandManager: EngineCommandManager
}): Promise<
| {
logs: string[]
@ -539,7 +534,7 @@ async function executeCode({
if (e instanceof KCLError) {
errors = [e]
logs = []
if (e.msg === 'file is empty') engineCommandManager?.endSession()
if (e.msg === 'file is empty') engineCommandManager.endSession()
}
return {
isChange: true,
@ -562,7 +557,7 @@ async function executeCode({
}
// Check if the ast we have is equal to the ast in the storage.
// If it is, we don't need to update the ast.
if (!engineCommandManager || JSON.stringify(ast) === JSON.stringify(lastAst))
if (JSON.stringify(ast) === JSON.stringify(lastAst))
return { isChange: false }
const { logs, errors, programMemory } = await executeAst({

src/wasm-lib/Cargo.lock generated
View File

@ -63,6 +63,12 @@ dependencies = [
"libc",
]
[[package]]
name = "anes"
version = "0.1.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4b46cbb362ab8752921c97e041f5e366ee6297bd428a31275b9fcf1e380f7299"
[[package]]
name = "anstream"
version = "0.5.0"
@ -142,6 +148,17 @@ dependencies = [
"thiserror",
]
[[package]]
name = "async-recursion"
version = "1.0.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5fd55a5ba1179988837d24ab4c7cc8ed6efdeff578ede0416b4225a5fca35bd0"
dependencies = [
"proc-macro2",
"quote",
"syn 2.0.37",
]
[[package]]
name = "async-trait"
version = "0.1.73"
@ -211,13 +228,16 @@ checksum = "9ba43ea6f343b788c8764558649e08df62f86c6ef251fdaeb1ffd010a9ae50a2"
[[package]]
name = "bigdecimal"
version = "0.3.1"
version = "0.4.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a6773ddc0eafc0e509fb60e48dff7f450f8e674a0686ae8605e8d9901bd5eefa"
checksum = "454bca3db10617b88b566f205ed190aedb0e0e6dd4cad61d3988a72e8c5594cb"
dependencies = [
"autocfg",
"libm",
"num-bigint",
"num-integer",
"num-traits",
"serde",
]
[[package]]
@ -337,6 +357,12 @@ dependencies = [
"serde",
]
[[package]]
name = "cast"
version = "0.3.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "37b2a672a2cb129a2e41c10b1224bb368f9f37a2b16b612598138befd7b37eb5"
[[package]]
name = "cc"
version = "1.0.83"
@ -374,6 +400,33 @@ dependencies = [
"windows-targets 0.48.5",
]
[[package]]
name = "ciborium"
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "effd91f6c78e5a4ace8a5d3c0b6bfaec9e2baaef55f3efc00e45fb2e477ee926"
dependencies = [
"ciborium-io",
"ciborium-ll",
"serde",
]
[[package]]
name = "ciborium-io"
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cdf919175532b369853f5d5e20b26b43112613fd6fe7aee757e35f7a44642656"
[[package]]
name = "ciborium-ll"
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "defaa24ecc093c77630e6c15e17c51f5e187bf35ee514f4e2d67baaa96dae22b"
dependencies = [
"ciborium-io",
"half 1.8.2",
]
[[package]]
name = "clang-sys"
version = "1.6.1"
@ -507,6 +560,42 @@ dependencies = [
"cfg-if",
]
[[package]]
name = "criterion"
version = "0.5.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f2b12d017a929603d80db1831cd3a24082f8137ce19c69e6447f54f5fc8d692f"
dependencies = [
"anes",
"cast",
"ciborium",
"clap",
"criterion-plot",
"is-terminal",
"itertools 0.10.5",
"num-traits",
"once_cell",
"oorandom",
"plotters",
"rayon",
"regex",
"serde",
"serde_derive",
"serde_json",
"tinytemplate",
"walkdir",
]
[[package]]
name = "criterion-plot"
version = "0.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6b50826342786a51a89e2da3a28f1c32b06e387201bc2d19791f622c673706b1"
dependencies = [
"cast",
"itertools 0.10.5",
]
[[package]]
name = "crossbeam-channel"
version = "0.5.8"
@ -716,7 +805,7 @@ checksum = "832a761f35ab3e6664babfbdc6cef35a4860e816ec3916dcfd0882954e98a8a8"
dependencies = [
"bit_field",
"flume",
"half",
"half 2.2.1",
"lebe",
"miniz_oxide",
"rayon-core",
@ -983,6 +1072,12 @@ dependencies = [
"tracing",
]
[[package]]
name = "half"
version = "1.8.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "eabb4a44450da02c90444cf74558da904edde8fb4e9035a9a6a4e15445af0bd7"
[[package]]
name = "half"
version = "2.2.1"
@ -1281,12 +1376,14 @@ dependencies = [
[[package]]
name = "kcl-lib"
version = "0.1.30"
version = "0.1.31"
dependencies = [
"anyhow",
"async-recursion",
"async-trait",
"bson",
"clap",
"criterion",
"dashmap",
"derive-docs",
"expectorate",
@ -1297,7 +1394,6 @@ dependencies = [
"lazy_static",
"parse-display",
"pretty_assertions",
"regex",
"reqwest",
"schemars",
"serde",
@ -1311,15 +1407,19 @@ dependencies = [
"wasm-bindgen",
"wasm-bindgen-futures",
"web-sys",
"winnow",
]
[[package]]
name = "kittycad"
version = "0.2.25"
version = "0.2.26"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e2623ee601ce203476229df3f9d3a14664cb43e3f7455e9ac8ed91aacaa6163d"
dependencies = [
"anyhow",
"async-trait",
"base64 0.21.4",
"bigdecimal",
"bytes",
"chrono",
"data-encoding",
@ -1381,6 +1481,12 @@ dependencies = [
"winapi",
]
[[package]]
name = "libm"
version = "0.2.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f7012b1bbb0719e1097c47611d3898568c546d597c2e74d66f6087edd5233ff4"
[[package]]
name = "linked-hash-map"
version = "0.5.6"
@ -1604,10 +1710,16 @@ version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "44d11de466f4a3006fe8a5e7ec84e93b79c70cb992ae0aa0eb631ad2df8abfe2"
[[package]]
name = "oorandom"
version = "11.1.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0ab1bc2a289d34bd04a330323ac98a1b4bc82c9d9fcb1e66b63caa84da26b575"
[[package]]
name = "openapitor"
version = "0.0.9"
source = "git+https://github.com/KittyCAD/kittycad.rs?branch=main#0d121f6881da91b4a30bee18bbfe50e4a2096073"
source = "git+https://github.com/KittyCAD/kittycad.rs?branch=main#61a16059b3eaf8793a2a2e1edbc0d770f284fea3"
dependencies = [
"Inflector",
"anyhow",
@ -1781,14 +1893,14 @@ dependencies = [
[[package]]
name = "phonenumber"
version = "0.3.2+8.13.9"
version = "0.3.3+8.13.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "34749f64ea9d76f10cdc8a859588b57775f59177c7dd91f744d620bd62982d6f"
checksum = "635f3e6288e4f01c049d89332a031bd74f25d64b6fb94703ca966e819488cd06"
dependencies = [
"bincode",
"either",
"fnv",
"itertools 0.10.5",
"itertools 0.11.0",
"lazy_static",
"nom",
"quick-xml",
@ -1796,6 +1908,7 @@ dependencies = [
"regex-cache",
"serde",
"serde_derive",
"strum",
"thiserror",
]
@ -1837,6 +1950,34 @@ version = "0.3.27"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "26072860ba924cbfa98ea39c8c19b4dd6a4a25423dbdf219c1eca91aa0cf6964"
[[package]]
name = "plotters"
version = "0.3.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d2c224ba00d7cadd4d5c660deaf2098e5e80e07846537c51f9cfa4be50c1fd45"
dependencies = [
"num-traits",
"plotters-backend",
"plotters-svg",
"wasm-bindgen",
"web-sys",
]
[[package]]
name = "plotters-backend"
version = "0.3.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9e76628b4d3a7581389a35d5b6e2139607ad7c75b17aed325f210aa91f4a9609"
[[package]]
name = "plotters-svg"
version = "0.3.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "38f6d39893cca0701371e3c27294f09797214b86f1fb951b89ade8ec04e2abab"
dependencies = [
"plotters-backend",
]
[[package]]
name = "png"
version = "0.17.10"
@ -2694,6 +2835,28 @@ dependencies = [
"syn 2.0.37",
]
[[package]]
name = "strum"
version = "0.24.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "063e6045c0e62079840579a7e47a355ae92f60eb74daaf156fb1e84ba164e63f"
dependencies = [
"strum_macros",
]
[[package]]
name = "strum_macros"
version = "0.24.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1e385be0d24f186b4ce2f9982191e7101bb737312ad61c1f2f984f34bcf85d59"
dependencies = [
"heck",
"proc-macro2",
"quote",
"rustversion",
"syn 1.0.109",
]
[[package]]
name = "syn"
version = "1.0.109"
@ -2851,6 +3014,16 @@ dependencies = [
"time-core",
]
[[package]]
name = "tinytemplate"
version = "1.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "be4d6b5f19ff7664e8c98d03e2139cb510db9b0a60b55f8e8709b689d939b6bc"
dependencies = [
"serde",
"serde_json",
]
[[package]]
name = "tinyvec"
version = "1.6.0"
@ -2908,9 +3081,9 @@ dependencies = [
[[package]]
name = "tokio-tungstenite"
version = "0.20.0"
version = "0.20.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2b2dbec703c26b00d74844519606ef15d09a7d6857860f84ad223dec002ddea2"
checksum = "212d5dcb2a1ce06d81107c3d0ffa3121fe974b73f068c8282cb1c32328113b6c"
dependencies = [
"futures-util",
"log",
@ -3130,9 +3303,9 @@ dependencies = [
[[package]]
name = "tungstenite"
version = "0.20.0"
version = "0.20.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e862a1c4128df0112ab625f55cd5c934bcb4312ba80b39ae4b4835a3fd58e649"
checksum = "9e3dac10fd62eaf6617d3a904ae222845979aec67c615d1c842b4002c7666fb9"
dependencies = [
"byteorder",
"bytes",
@ -3619,6 +3792,15 @@ version = "0.48.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ed94fce61571a4006852b7389a063ab983c02eb1bb37b47f8272ce92d06d9538"
[[package]]
name = "winnow"
version = "0.5.15"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7c2e3184b9c4e92ad5167ca73039d0c42476302ab603e2fec4487511f38ccefc"
dependencies = [
"memchr",
]
[[package]]
name = "winreg"
version = "0.50.0"

View File

@ -11,8 +11,7 @@ crate-type = ["cdylib"]
bson = { version = "2.7.0", features = ["uuid-1", "chrono"] }
gloo-utils = "0.2.0"
kcl-lib = { path = "kcl" }
#kittycad = { version = "0.2.25", default-features = false, features = ["js"] }
kittycad = { git = "https://github.com/KittyCAD/kittycad.rs", branch = "achalmers/relative-path-segments" }
kittycad = { version = "0.2.25", default-features = false, features = ["js"] }
serde_json = "1.0.107"
uuid = { version = "1.4.1", features = ["v4", "js", "serde"] }
wasm-bindgen = "0.2.87"
@ -21,8 +20,7 @@ wasm-bindgen-futures = "0.4.37"
[dev-dependencies]
anyhow = "1"
image = "0.24.7"
#kittycad = "0.2.25"
kittycad = { git = "https://github.com/KittyCAD/kittycad.rs", branch = "achalmers/relative-path-segments" }
kittycad = "0.2.25"
pretty_assertions = "1.4.0"
reqwest = { version = "0.11.20", default-features = false }
tokio = { version = "1.32.0", features = ["rt-multi-thread", "macros", "time"] }

View File

@ -79,13 +79,6 @@ fn do_stdlib_inner(
));
}
if ast.sig.asyncness.is_some() {
errors.push(Error::new_spanned(
&ast.sig.fn_token,
"stdlib functions must not be async",
));
}
if ast.sig.unsafety.is_some() {
errors.push(Error::new_spanned(
&ast.sig.unsafety,
@ -118,6 +111,7 @@ fn do_stdlib_inner(
let fn_name = &ast.sig.ident;
let fn_name_str = fn_name.to_string().replace("inner_", "");
let fn_name_ident = format_ident!("{}", fn_name_str);
let boxed_fn_name_ident = format_ident!("boxed_{}", fn_name_str);
let _visibility = &ast.vis;
let (summary_text, description_text) = extract_doc_from_attrs(&ast.attrs);
@ -204,7 +198,10 @@ fn do_stdlib_inner(
syn::FnArg::Typed(pat) => pat.ty.as_ref().into_token_stream(),
};
let ty_string = ty.to_string().replace('&', "").replace("mut", "").replace(' ', "");
let mut ty_string = ty.to_string().replace('&', "").replace("mut", "").replace(' ', "");
if ty_string.starts_with("Args") {
ty_string = "Args".to_string();
}
let ty_string = ty_string.trim().to_string();
let ty_ident = if ty_string.starts_with("Vec<") {
let ty_string = ty_string.trim_start_matches("Vec<").trim_end_matches('>');
@ -305,6 +302,14 @@ fn do_stdlib_inner(
#description_doc_comment
#const_struct
fn #boxed_fn_name_ident(
args: crate::std::Args,
) -> std::pin::Pin<
Box<dyn std::future::Future<Output = anyhow::Result<crate::executor::MemoryItem, crate::errors::KclError>>>,
> {
Box::pin(#fn_name_ident(args))
}
impl #docs_crate::StdLibFn for #name_ident
{
fn name(&self) -> String {
@ -348,7 +353,7 @@ fn do_stdlib_inner(
}
fn std_lib_fn(&self) -> crate::std::StdFn {
#fn_name_ident
#boxed_fn_name_ident
}
fn clone_box(&self) -> Box<dyn #docs_crate::StdLibFn> {
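A note on the `boxed_*` shims the macro now emits: `std_lib_fn()` returns `crate::std::StdFn`, which needs a single nameable signature, while each `async fn` stdlib implementation returns its own opaque future type. Wrapping the call in `Box::pin` erases that type behind `Pin<Box<dyn Future<…>>>`, so the shim (a plain, non-async fn) can coerce to the function-pointer type. A minimal sketch of the same pattern, using stand-in types (`Args`, plain `i64`/`String` results) rather than the crate's real `MemoryItem`/`KclError`:

```rust
use std::future::Future;
use std::pin::Pin;

// Stand-in for the real Args type (hypothetical, for illustration only).
struct Args {
    values: Vec<i64>,
}

// The registry stores plain function pointers, so the future type must be erased.
type BoxedFuture<T> = Pin<Box<dyn Future<Output = T>>>;
type StdFn = fn(Args) -> BoxedFuture<Result<i64, String>>;

// The real implementation is an async fn; its return type is an unnameable opaque future.
async fn min(args: Args) -> Result<i64, String> {
    args.values
        .iter()
        .copied()
        .min()
        .ok_or_else(|| "min requires at least one argument".to_string())
}

// The generated shim pins and boxes that future so `min` fits behind `StdFn`.
fn boxed_min(args: Args) -> BoxedFuture<Result<i64, String>> {
    Box::pin(min(args))
}

fn std_lib_fn() -> StdFn {
    boxed_min
}

#[tokio::main(flavor = "current_thread")]
async fn main() {
    let result = std_lib_fn()(Args { values: vec![3, 1, 2] }).await;
    println!("{result:?}"); // Ok(1)
}
```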

View File

@ -7,6 +7,18 @@ pub(crate) struct Show {}
#[allow(non_upper_case_globals, missing_docs)]
#[doc = "Std lib function: show"]
pub(crate) const Show: Show = Show {};
fn boxed_show(
args: crate::std::Args,
) -> std::pin::Pin<
Box<
dyn std::future::Future<
Output = anyhow::Result<crate::executor::MemoryItem, crate::errors::KclError>,
>,
>,
> {
Box::pin(show(args))
}
impl crate::docs::StdLibFn for Show {
fn name(&self) -> String {
"show".to_string()
@ -57,7 +69,7 @@ impl crate::docs::StdLibFn for Show {
}
fn std_lib_fn(&self) -> crate::std::StdFn {
show
boxed_show
}
fn clone_box(&self) -> Box<dyn crate::docs::StdLibFn> {

View File

@ -7,6 +7,18 @@ pub(crate) struct LineTo {}
#[allow(non_upper_case_globals, missing_docs)]
#[doc = "Std lib function: lineTo"]
pub(crate) const LineTo: LineTo = LineTo {};
fn boxed_line_to(
args: crate::std::Args,
) -> std::pin::Pin<
Box<
dyn std::future::Future<
Output = anyhow::Result<crate::executor::MemoryItem, crate::errors::KclError>,
>,
>,
> {
Box::pin(line_to(args))
}
impl crate::docs::StdLibFn for LineTo {
fn name(&self) -> String {
"lineTo".to_string()
@ -65,7 +77,7 @@ impl crate::docs::StdLibFn for LineTo {
}
fn std_lib_fn(&self) -> crate::std::StdFn {
line_to
boxed_line_to
}
fn clone_box(&self) -> Box<dyn crate::docs::StdLibFn> {

View File

@ -7,6 +7,18 @@ pub(crate) struct Min {}
#[allow(non_upper_case_globals, missing_docs)]
#[doc = "Std lib function: min"]
pub(crate) const Min: Min = Min {};
fn boxed_min(
args: crate::std::Args,
) -> std::pin::Pin<
Box<
dyn std::future::Future<
Output = anyhow::Result<crate::executor::MemoryItem, crate::errors::KclError>,
>,
>,
> {
Box::pin(min(args))
}
impl crate::docs::StdLibFn for Min {
fn name(&self) -> String {
"min".to_string()
@ -57,7 +69,7 @@ impl crate::docs::StdLibFn for Min {
}
fn std_lib_fn(&self) -> crate::std::StdFn {
min
boxed_min
}
fn clone_box(&self) -> Box<dyn crate::docs::StdLibFn> {

View File

@ -7,6 +7,18 @@ pub(crate) struct Show {}
#[allow(non_upper_case_globals, missing_docs)]
#[doc = "Std lib function: show"]
pub(crate) const Show: Show = Show {};
fn boxed_show(
args: crate::std::Args,
) -> std::pin::Pin<
Box<
dyn std::future::Future<
Output = anyhow::Result<crate::executor::MemoryItem, crate::errors::KclError>,
>,
>,
> {
Box::pin(show(args))
}
impl crate::docs::StdLibFn for Show {
fn name(&self) -> String {
"show".to_string()
@ -52,7 +64,7 @@ impl crate::docs::StdLibFn for Show {
}
fn std_lib_fn(&self) -> crate::std::StdFn {
show
boxed_show
}
fn clone_box(&self) -> Box<dyn crate::docs::StdLibFn> {

View File

@ -1,7 +1,7 @@
[package]
name = "kcl-lib"
description = "KittyCAD Language"
version = "0.1.30"
version = "0.1.31"
edition = "2021"
license = "MIT"
@ -9,21 +9,22 @@ license = "MIT"
[dependencies]
anyhow = { version = "1.0.75", features = ["backtrace"] }
async-recursion = "1.0.5"
async-trait = "0.1.73"
clap = { version = "4.4.3", features = ["cargo", "derive", "env", "unicode"], optional = true }
dashmap = "5.5.3"
#derive-docs = { version = "0.1.4" }
derive-docs = { path = "../derive-docs" }
kittycad = { git = "https://github.com/KittyCAD/kittycad.rs", branch = "achalmers/relative-path-segments", default-features = false, features = ["js"] }
kittycad = { version = "0.2.25", default-features = false, features = ["js"] }
lazy_static = "1.4.0"
parse-display = "0.8.2"
regex = "1.7.1"
schemars = { version = "0.8", features = ["impl_json_schema", "url", "uuid1"] }
serde = { version = "1.0.188", features = ["derive"] }
serde_json = "1.0.107"
thiserror = "1.0.48"
ts-rs = { version = "7", package = "ts-rs-json-value", features = ["serde-json-impl", "schemars-impl", "uuid-impl"] }
uuid = { version = "1.4.1", features = ["v4", "js", "serde"] }
winnow = "0.5.15"
[target.'cfg(target_arch = "wasm32")'.dependencies]
js-sys = { version = "0.3.64" }
@ -50,7 +51,12 @@ panic = "abort"
debug = true
[dev-dependencies]
criterion = "0.5.1"
expectorate = "1.0.7"
itertools = "0.11.0"
pretty_assertions = "1.4.0"
tokio = { version = "1.32.0", features = ["rt-multi-thread", "macros", "time"] }
[[bench]]
name = "compiler_benchmark"
harness = false

View File

@ -0,0 +1,41 @@
use criterion::{black_box, criterion_group, criterion_main, Criterion};
pub fn bench_lex(c: &mut Criterion) {
c.bench_function("lex_cube", |b| b.iter(|| lex(CUBE_PROGRAM)));
c.bench_function("lex_big_kitt", |b| b.iter(|| lex(KITT_PROGRAM)));
c.bench_function("lex_pipes_on_pipes", |b| b.iter(|| lex(PIPES_PROGRAM)));
}
pub fn bench_lex_parse(c: &mut Criterion) {
c.bench_function("parse_lex_cube", |b| b.iter(|| lex_and_parse(CUBE_PROGRAM)));
c.bench_function("parse_lex_big_kitt", |b| b.iter(|| lex_and_parse(KITT_PROGRAM)));
c.bench_function("parse_lex_pipes_on_pipes", |b| b.iter(|| lex_and_parse(PIPES_PROGRAM)));
}
fn lex(program: &str) {
black_box(kcl_lib::token::lexer(program));
}
fn lex_and_parse(program: &str) {
let tokens = kcl_lib::token::lexer(program);
let parser = kcl_lib::parser::Parser::new(tokens);
black_box(parser.ast().unwrap());
}
criterion_group!(benches, bench_lex, bench_lex_parse);
criterion_main!(benches);
const KITT_PROGRAM: &str = include_str!("../../tests/executor/inputs/kittycad_svg.kcl");
const PIPES_PROGRAM: &str = include_str!("../../tests/executor/inputs/pipes_on_pipes.kcl");
const CUBE_PROGRAM: &str = r#"fn cube = (pos, scale) => {
const sg = startSketchAt(pos)
|> line([0, scale], %)
|> line([scale, 0], %)
|> line([0, -scale], %)
return sg
}
const b1 = cube([0,0], 10)
const pt1 = b1[0]
show(b1)"#;
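These benches run under Criterion rather than libtest, which is why the `[[bench]]` entry above sets `harness = false` and the file supplies its own `main` via `criterion_main!`; they are driven with `cargo bench` from the kcl crate. If the defaults prove too slow or noisy on the larger inputs (e.g. `kittycad_svg.kcl`), the group can carry its own configuration. A hedged sketch using only Criterion's documented builder API, with a toy workload standing in for the real lexer:

```rust
use std::time::Duration;

use criterion::{black_box, criterion_group, criterion_main, Criterion};

// Toy stand-in for kcl_lib::token::lexer, just so the sketch is self-contained.
fn lex(program: &str) -> usize {
    program.split_whitespace().count()
}

fn bench_lex(c: &mut Criterion) {
    c.bench_function("lex_cube", |b| {
        b.iter(|| lex(black_box("fn cube = (pos, scale) => { return pos }")))
    });
}

criterion_group! {
    name = benches;
    // Fewer samples and a longer measurement window for heavy programs.
    config = Criterion::default()
        .sample_size(50)
        .measurement_time(Duration::from_secs(10));
    targets = bench_lex
}
criterion_main!(benches);
```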

View File

@ -0,0 +1 @@
enum-variant-size-threshold = 24
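This new `clippy.toml` lowers `enum-variant-size-threshold` to 24 bytes, the configuration key read by clippy's `large_enum_variant` lint (the stock threshold is much higher). With it, an enum whose biggest variant dwarfs the others gets flagged, and the usual remedy is to box the oversized payload so the enum itself stays small. A minimal sketch of what the lint steers toward:

```rust
// With enum-variant-size-threshold = 24, clippy::large_enum_variant warns here:
// the Big variant carries 256 bytes inline while Small needs only 8.
#[allow(dead_code)]
enum MessageInline {
    Small(u64),
    Big([u8; 256]),
}

// Boxing the large payload keeps the enum itself small; Big is now just a pointer.
#[allow(dead_code)]
enum MessageBoxed {
    Small(u64),
    Big(Box<[u8; 256]>),
}

fn main() {
    println!(
        "inline: {} bytes, boxed: {} bytes",
        std::mem::size_of::<MessageInline>(), // 264 on a 64-bit target
        std::mem::size_of::<MessageBoxed>()   // 16 on a 64-bit target
    );
}
```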

View File

@ -44,54 +44,6 @@ dependencies = [
"memchr",
]
[[package]]
name = "anstream"
version = "0.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b1f58811cfac344940f1a400b6e6231ce35171f614f26439e80f8c1465c5cc0c"
dependencies = [
"anstyle",
"anstyle-parse",
"anstyle-query",
"anstyle-wincon",
"colorchoice",
"utf8parse",
]
[[package]]
name = "anstyle"
version = "1.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "15c4c2c83f81532e5845a733998b6971faca23490340a418e9b72a3ec9de12ea"
[[package]]
name = "anstyle-parse"
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "938874ff5980b03a87c5524b3ae5b59cf99b1d6bc836848df7bc5ada9643c333"
dependencies = [
"utf8parse",
]
[[package]]
name = "anstyle-query"
version = "1.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5ca11d4be1bab0c8bc8734a9aa7bf4ee8316d462a08c6ac5052f888fef5b494b"
dependencies = [
"windows-sys",
]
[[package]]
name = "anstyle-wincon"
version = "2.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "58f54d10c6dfa51283a066ceab3ec1ab78d13fae00aa49243a45e4571fb79dfd"
dependencies = [
"anstyle",
"windows-sys",
]
[[package]]
name = "anyhow"
version = "1.0.75"
@ -123,6 +75,17 @@ dependencies = [
"thiserror",
]
[[package]]
name = "async-recursion"
version = "1.0.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5fd55a5ba1179988837d24ab4c7cc8ed6efdeff578ede0416b4225a5fca35bd0"
dependencies = [
"proc-macro2",
"quote",
"syn 2.0.37",
]
[[package]]
name = "async-trait"
version = "0.1.73"
@ -131,7 +94,7 @@ checksum = "bc00ceb34980c03614e35a3a4e218276a0a824e911d07651cd0d858a51e8c0f0"
dependencies = [
"proc-macro2",
"quote",
"syn 2.0.31",
"syn 2.0.37",
]
[[package]]
@ -179,6 +142,20 @@ version = "0.21.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "414dcefbc63d77c526a76b3afcf6fbb9b5e2791c19c3aa2297733208750c6e53"
[[package]]
name = "bigdecimal"
version = "0.4.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "454bca3db10617b88b566f205ed190aedb0e0e6dd4cad61d3988a72e8c5594cb"
dependencies = [
"autocfg",
"libm",
"num-bigint",
"num-integer",
"num-traits",
"serde",
]
[[package]]
name = "bincode"
version = "1.3.3"
@ -284,54 +261,6 @@ dependencies = [
"serde",
]
[[package]]
name = "clap"
version = "4.4.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6a13b88d2c62ff462f88e4a121f17a82c1af05693a2f192b5c38d14de73c19f6"
dependencies = [
"clap_builder",
"clap_derive",
]
[[package]]
name = "clap_builder"
version = "4.4.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2bb9faaa7c2ef94b2743a21f5a29e6f0010dff4caa69ac8e9d6cf8b6fa74da08"
dependencies = [
"anstream",
"anstyle",
"clap_lex",
"strsim",
"unicase",
"unicode-width",
]
[[package]]
name = "clap_derive"
version = "4.4.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0862016ff20d69b84ef8247369fabf5c008a7417002411897d40ee1f4532b873"
dependencies = [
"heck",
"proc-macro2",
"quote",
"syn 2.0.31",
]
[[package]]
name = "clap_lex"
version = "0.5.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cd7cc57abe963c6d3b9d8be5b06ba7c8957a930305ca90304f24ef040aa6f961"
[[package]]
name = "colorchoice"
version = "1.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "acbf1af155f9b9ef647e42cdc158db4b64a1b61f743629225fde6f3e0be2a7c7"
[[package]]
name = "convert_case"
version = "0.6.0"
@ -403,16 +332,14 @@ checksum = "f2696e8a945f658fd14dc3b87242e6b80cd0f36ff04ea560fa39082368847946"
[[package]]
name = "derive-docs"
version = "0.1.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5fe5c5ea065cfabc5a7c5e8ed616e369fbf108c4be01e0e5609bc9846a732664"
version = "0.1.4"
dependencies = [
"convert_case",
"proc-macro2",
"quote",
"serde",
"serde_tokenstream",
"syn 2.0.31",
"syn 2.0.37",
]
[[package]]
@ -529,7 +456,7 @@ checksum = "89ca545a94061b6365f2c7355b4b32bd20df3ff95f02da9329b34ccc3bd6ee72"
dependencies = [
"proc-macro2",
"quote",
"syn 2.0.31",
"syn 2.0.37",
]
[[package]]
@ -769,11 +696,12 @@ dependencies = [
[[package]]
name = "kcl-lib"
version = "0.1.24"
version = "0.1.31"
dependencies = [
"anyhow",
"async-recursion",
"async-trait",
"bson",
"clap",
"dashmap",
"derive-docs",
"futures",
@ -781,7 +709,6 @@ dependencies = [
"kittycad",
"lazy_static",
"parse-display",
"regex",
"reqwest",
"schemars",
"serde",
@ -794,6 +721,8 @@ dependencies = [
"uuid",
"wasm-bindgen",
"wasm-bindgen-futures",
"web-sys",
"winnow",
]
[[package]]
@ -806,12 +735,13 @@ dependencies = [
[[package]]
name = "kittycad"
version = "0.2.24"
version = "0.2.26"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fd2f78e95054c83ab77059fe0237b128e2b241b4b8e9466e452d701c2599f3d3"
checksum = "e2623ee601ce203476229df3f9d3a14664cb43e3f7455e9ac8ed91aacaa6163d"
dependencies = [
"anyhow",
"base64 0.21.3",
"bigdecimal",
"bytes",
"chrono",
"data-encoding",
@ -850,6 +780,12 @@ dependencies = [
"once_cell",
]
[[package]]
name = "libm"
version = "0.2.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f7012b1bbb0719e1097c47611d3898568c546d597c2e74d66f6087edd5233ff4"
[[package]]
name = "linked-hash-map"
version = "0.5.6"
@ -942,6 +878,27 @@ dependencies = [
"minimal-lexical",
]
[[package]]
name = "num-bigint"
version = "0.4.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "608e7659b5c3d7cba262d894801b9ec9d00de989e8a82bd4bef91d08da45cdc0"
dependencies = [
"autocfg",
"num-integer",
"num-traits",
]
[[package]]
name = "num-integer"
version = "0.1.45"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "225d3389fb3509a24c93f5c29eb6bde2586b98d9f016636dff58d7c6f7569cd9"
dependencies = [
"autocfg",
"num-traits",
]
[[package]]
name = "num-traits"
version = "0.2.16"
@ -1034,7 +991,7 @@ dependencies = [
"regex",
"regex-syntax 0.7.5",
"structmeta",
"syn 2.0.31",
"syn 2.0.37",
]
[[package]]
@ -1045,9 +1002,9 @@ checksum = "9b2a4787296e9989611394c33f193f676704af1686e70b8f8033ab5ba9a35a94"
[[package]]
name = "phonenumber"
version = "0.3.2+8.13.9"
version = "0.3.3+8.13.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "34749f64ea9d76f10cdc8a859588b57775f59177c7dd91f744d620bd62982d6f"
checksum = "635f3e6288e4f01c049d89332a031bd74f25d64b6fb94703ca966e819488cd06"
dependencies = [
"bincode",
"either",
@ -1060,6 +1017,7 @@ dependencies = [
"regex-cache",
"serde",
"serde_derive",
"strum",
"thiserror",
]
@ -1080,7 +1038,7 @@ checksum = "4359fd9c9171ec6e8c62926d6faaf553a8dc3f64e1507e76da7911b4f6a04405"
dependencies = [
"proc-macro2",
"quote",
"syn 2.0.31",
"syn 2.0.37",
]
[[package]]
@ -1127,9 +1085,9 @@ dependencies = [
[[package]]
name = "proc-macro2"
version = "1.0.66"
version = "1.0.67"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "18fb31db3f9bddb2ea821cde30a9f70117e3f119938b5ee630b7403aa6e2ead9"
checksum = "3d433d9f1a3e8c1263d9456598b16fec66f4acc9a74dacffd35c7bb09b3a1328"
dependencies = [
"unicode-ident",
]
@ -1342,6 +1300,12 @@ dependencies = [
"untrusted",
]
[[package]]
name = "rustversion"
version = "1.0.14"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7ffc183a10b4478d04cbbbfc96d0873219d962dd5accaff2ffbd4ceb7df837f4"
[[package]]
name = "ryu"
version = "1.0.15"
@ -1359,10 +1323,11 @@ dependencies = [
[[package]]
name = "schemars"
version = "0.8.13"
version = "0.8.15"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "763f8cd0d4c71ed8389c90cb8100cba87e763bd01a8e614d4f0af97bcd50a161"
checksum = "1f7b0ce13155372a76ee2e1c5ffba1fe61ede73fbea5630d61eee6fac4929c0c"
dependencies = [
"bigdecimal",
"bytes",
"chrono",
"dyn-clone",
@ -1375,9 +1340,9 @@ dependencies = [
[[package]]
name = "schemars_derive"
version = "0.8.13"
version = "0.8.15"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ec0f696e21e10fa546b7ffb1c9672c6de8fbc7a81acf59524386d8639bf12737"
checksum = "e85e2a16b12bdb763244c69ab79363d71db2b4b918a2def53f80b02e0574b13c"
dependencies = [
"proc-macro2",
"quote",
@ -1450,7 +1415,7 @@ checksum = "4eca7ac642d82aa35b60049a6eccb4be6be75e599bd2e9adb5f875a737654af2"
dependencies = [
"proc-macro2",
"quote",
"syn 2.0.31",
"syn 2.0.37",
]
[[package]]
@ -1466,9 +1431,9 @@ dependencies = [
[[package]]
name = "serde_json"
version = "1.0.105"
version = "1.0.107"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "693151e1ac27563d6dbcec9dee9fbd5da8539b20fa14ad3752b2e6d363ace360"
checksum = "6b420ce6e3d8bd882e9b243c6eed35dbc9a6110c9769e74b584e0d68d1f20c65"
dependencies = [
"indexmap 2.0.0",
"itoa",
@ -1484,7 +1449,7 @@ checksum = "8725e1dfadb3a50f7e5ce0b1a540466f6ed3fe7a0fca2ac2b8b831d31316bd00"
dependencies = [
"proc-macro2",
"quote",
"syn 2.0.31",
"syn 2.0.37",
]
[[package]]
@ -1496,7 +1461,7 @@ dependencies = [
"proc-macro2",
"quote",
"serde",
"syn 2.0.31",
"syn 2.0.37",
]
[[package]]
@ -1572,12 +1537,6 @@ version = "0.5.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6e63cff320ae2c57904679ba7cb63280a3dc4613885beafb148ee7bf9aa9042d"
[[package]]
name = "strsim"
version = "0.10.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "73473c0e59e6d5812c5dfe2a064a6444949f089e20eec9a2e5506596494e4623"
[[package]]
name = "structmeta"
version = "0.2.0"
@ -1587,7 +1546,7 @@ dependencies = [
"proc-macro2",
"quote",
"structmeta-derive",
"syn 2.0.31",
"syn 2.0.37",
]
[[package]]
@ -1598,7 +1557,29 @@ checksum = "a60bcaff7397072dca0017d1db428e30d5002e00b6847703e2e42005c95fbe00"
dependencies = [
"proc-macro2",
"quote",
"syn 2.0.31",
"syn 2.0.37",
]
[[package]]
name = "strum"
version = "0.24.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "063e6045c0e62079840579a7e47a355ae92f60eb74daaf156fb1e84ba164e63f"
dependencies = [
"strum_macros",
]
[[package]]
name = "strum_macros"
version = "0.24.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1e385be0d24f186b4ce2f9982191e7101bb737312ad61c1f2f984f34bcf85d59"
dependencies = [
"heck",
"proc-macro2",
"quote",
"rustversion",
"syn 1.0.109",
]
[[package]]
@ -1614,9 +1595,9 @@ dependencies = [
[[package]]
name = "syn"
version = "2.0.31"
version = "2.0.37"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "718fa2415bcb8d8bd775917a1bf12a7931b6dfa890753378538118181e0cb398"
checksum = "7303ef2c05cd654186cb250d29049a24840ca25d2747c25c0381c8d9e2f582e8"
dependencies = [
"proc-macro2",
"quote",
@ -1655,7 +1636,7 @@ checksum = "49922ecae66cc8a249b77e68d1d0623c1b2c514f0060c27cdc68bd62a1219d35"
dependencies = [
"proc-macro2",
"quote",
"syn 2.0.31",
"syn 2.0.37",
]
[[package]]
@ -1728,7 +1709,7 @@ checksum = "630bdcf245f78637c13ec01ffae6187cca34625e8c63150d424b59e55af2675e"
dependencies = [
"proc-macro2",
"quote",
"syn 2.0.31",
"syn 2.0.37",
]
[[package]]
@ -1822,7 +1803,7 @@ checksum = "84fd902d4e0b9a4b27f2f440108dc034e1758628a9b702f8ec61ad66355422fa"
dependencies = [
"proc-macro2",
"quote",
"syn 2.0.31",
"syn 2.0.37",
]
[[package]]
@ -1851,7 +1832,7 @@ checksum = "5f4f31f56159e98206da9efd823404b79b6ef3143b4a7ab76e67b1751b25a4ab"
dependencies = [
"proc-macro2",
"quote",
"syn 2.0.31",
"syn 2.0.37",
]
[[package]]
@ -1891,15 +1872,15 @@ dependencies = [
"Inflector",
"proc-macro2",
"quote",
"syn 2.0.31",
"syn 2.0.37",
"termcolor",
]
[[package]]
name = "tungstenite"
version = "0.20.0"
version = "0.20.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e862a1c4128df0112ab625f55cd5c934bcb4312ba80b39ae4b4835a3fd58e649"
checksum = "9e3dac10fd62eaf6617d3a904ae222845979aec67c615d1c842b4002c7666fb9"
dependencies = [
"byteorder",
"bytes",
@ -1921,15 +1902,6 @@ version = "1.16.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "497961ef93d974e23eb6f433eb5fe1b7930b659f06d12dec6fc44a8f554c0bba"
[[package]]
name = "unicase"
version = "2.7.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f7d2d4dafb69621809a81864c9c1b864479e1235c0dd4e199924b9742439ed89"
dependencies = [
"version_check",
]
[[package]]
name = "unicode-bidi"
version = "0.3.13"
@ -1957,12 +1929,6 @@ version = "1.10.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1dd624098567895118886609431a7c3b8f516e41d30e0643f03d94592a147e36"
[[package]]
name = "unicode-width"
version = "0.1.10"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c0edd1e5b14653f783770bce4a4dabb4a5108a5370a5f5d8cfe8710c361f6c8b"
[[package]]
name = "untrusted"
version = "0.7.1"
@ -1987,12 +1953,6 @@ version = "0.7.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "09cc8ee72d2a9becf2f2febe0205bbed8fc6615b7cb429ad062dc7b7ddd036a9"
[[package]]
name = "utf8parse"
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "711b9620af191e0cdc7468a8d14e709c3dcdb115b36f838e601583af800a370a"
[[package]]
name = "uuid"
version = "1.4.1"
@ -2046,7 +2006,7 @@ dependencies = [
"once_cell",
"proc-macro2",
"quote",
"syn 2.0.31",
"syn 2.0.37",
"wasm-bindgen-shared",
]
@ -2080,7 +2040,7 @@ checksum = "54681b18a46765f095758388f2d0cf16eb8d4169b639ab575a8f5693af210c7b"
dependencies = [
"proc-macro2",
"quote",
"syn 2.0.31",
"syn 2.0.37",
"wasm-bindgen-backend",
"wasm-bindgen-shared",
]
@ -2198,6 +2158,15 @@ version = "0.48.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ed94fce61571a4006852b7389a063ab983c02eb1bb37b47f8272ce92d06d9538"
[[package]]
name = "winnow"
version = "0.5.15"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7c2e3184b9c4e92ad5167ca73039d0c42476302ab603e2fec4487511f38ccefc"
dependencies = [
"memchr",
]
[[package]]
name = "winreg"
version = "0.50.0"

View File

@ -71,7 +71,7 @@ pub async fn modify_ast_for_sketch(
// Let's get the path info.
let resp = engine
.send_modeling_cmd_get_response(
.send_modeling_cmd(
uuid::Uuid::new_v4(),
SourceRange::default(),
ModelingCmd::PathGetInfo { path_id: sketch_id },
@ -88,47 +88,6 @@ pub async fn modify_ast_for_sketch(
}));
};
/* // Let's try to get the children of the sketch.
let resp = engine
.send_modeling_cmd_get_response(
uuid::Uuid::new_v4(),
SourceRange::default(),
ModelingCmd::EntityGetAllChildUuids { entity_id: sketch_id },
)
.await?;
let kittycad::types::OkWebSocketResponseData::Modeling {
modeling_response: kittycad::types::OkModelingCmdResponse::EntityGetAllChildUuids { data: children_info },
} = &resp
else {
return Err(KclError::Engine(KclErrorDetails {
message: format!("Get child info response was not as expected: {:?}", resp),
source_ranges: vec![SourceRange::default()],
}));
};
println!("children_info: {:#?}", children_info);
// Let's try to get the parent id.
let resp = engine
.send_modeling_cmd_get_response(
uuid::Uuid::new_v4(),
SourceRange::default(),
ModelingCmd::EntityGetParentId { entity_id: sketch_id },
)
.await?;
let kittycad::types::OkWebSocketResponseData::Modeling {
modeling_response: kittycad::types::OkModelingCmdResponse::EntityGetParentId { data: parent_info },
} = &resp
else {
return Err(KclError::Engine(KclErrorDetails {
message: format!("Get parent id response was not as expected: {:?}", resp),
source_ranges: vec![SourceRange::default()],
}));
};
println!("parent_info: {:#?}", parent_info);*/
// Now let's get the control points for all the segments.
// TODO: We should probably await all these at once so we aren't going one by one.
// But I guess this is fine for now.
@ -136,7 +95,7 @@ pub async fn modify_ast_for_sketch(
let mut control_points = Vec::new();
for segment in &path_info.segments {
if let Some(command_id) = &segment.command_id {
let h = engine.send_modeling_cmd_get_response(
let h = engine.send_modeling_cmd(
uuid::Uuid::new_v4(),
SourceRange::default(),
ModelingCmd::CurveGetControlPoints { curve_id: *command_id },
@ -207,7 +166,7 @@ pub async fn modify_ast_for_sketch(
let recasted = program.recast(&FormatOptions::default(), 0);
// Re-parse the ast so we get the correct source ranges.
let tokens = crate::tokeniser::lexer(&recasted);
let tokens = crate::token::lexer(&recasted);
let parser = crate::parser::Parser::new(tokens);
*program = parser.ast()?;

View File

@ -571,11 +571,12 @@ impl BinaryPart {
}
}
pub fn get_result(
#[async_recursion::async_recursion(?Send)]
pub async fn get_result(
&self,
memory: &mut ProgramMemory,
pipe_info: &mut PipeInfo,
engine: &mut EngineConnection,
engine: &EngineConnection,
) -> Result<MemoryItem, KclError> {
// We DO NOT set this globally because if we did and this was called inside a pipe it would
// stop the execution of the pipe.
@ -590,11 +591,13 @@ impl BinaryPart {
Ok(value.clone())
}
BinaryPart::BinaryExpression(binary_expression) => {
binary_expression.get_result(memory, &mut new_pipe_info, engine)
binary_expression.get_result(memory, &mut new_pipe_info, engine).await
}
BinaryPart::CallExpression(call_expression) => {
call_expression.execute(memory, &mut new_pipe_info, engine).await
}
BinaryPart::CallExpression(call_expression) => call_expression.execute(memory, &mut new_pipe_info, engine),
BinaryPart::UnaryExpression(unary_expression) => {
unary_expression.get_result(memory, &mut new_pipe_info, engine)
unary_expression.get_result(memory, &mut new_pipe_info, engine).await
}
BinaryPart::MemberExpression(member_expression) => member_expression.get_result(memory),
}
@ -810,11 +813,12 @@ impl CallExpression {
)
}
pub fn execute(
#[async_recursion::async_recursion(?Send)]
pub async fn execute(
&self,
memory: &mut ProgramMemory,
pipe_info: &mut PipeInfo,
engine: &mut EngineConnection,
engine: &EngineConnection,
) -> Result<MemoryItem, KclError> {
let fn_name = self.callee.name.clone();
@ -828,7 +832,7 @@ impl CallExpression {
value.clone()
}
Value::BinaryExpression(binary_expression) => {
binary_expression.get_result(memory, pipe_info, engine)?
binary_expression.get_result(memory, pipe_info, engine).await?
}
Value::CallExpression(call_expression) => {
// We DO NOT set this globally because if we did and this was called inside a pipe it would
@ -836,11 +840,15 @@ impl CallExpression {
// THIS IS IMPORTANT.
let mut new_pipe_info = pipe_info.clone();
new_pipe_info.is_in_pipe = false;
call_expression.execute(memory, &mut new_pipe_info, engine)?
call_expression.execute(memory, &mut new_pipe_info, engine).await?
}
Value::UnaryExpression(unary_expression) => unary_expression.get_result(memory, pipe_info, engine)?,
Value::ObjectExpression(object_expression) => object_expression.execute(memory, pipe_info, engine)?,
Value::ArrayExpression(array_expression) => array_expression.execute(memory, pipe_info, engine)?,
Value::UnaryExpression(unary_expression) => {
unary_expression.get_result(memory, pipe_info, engine).await?
}
Value::ObjectExpression(object_expression) => {
object_expression.execute(memory, pipe_info, engine).await?
}
Value::ArrayExpression(array_expression) => array_expression.execute(memory, pipe_info, engine).await?,
Value::PipeExpression(pipe_expression) => {
return Err(KclError::Semantic(KclErrorDetails {
message: format!("PipeExpression not implemented here: {:?}", pipe_expression),
@ -871,31 +879,28 @@ impl CallExpression {
match &self.function {
Function::StdLib { func } => {
/* let source_range: SourceRange = self.into();
println!(
"Calling stdlib function: {}, source_range: {:?}, args: {:?}",
fn_name, source_range, fn_args
);*/
// Attempt to call the function.
let mut args = crate::std::Args::new(fn_args, self.into(), engine);
let result = func.std_lib_fn()(&mut args)?;
let args = crate::std::Args::new(fn_args, self.into(), engine.clone());
let result = func.std_lib_fn()(args).await?;
if pipe_info.is_in_pipe {
pipe_info.index += 1;
pipe_info.previous_results.push(result);
execute_pipe_body(memory, &pipe_info.body.clone(), pipe_info, self.into(), engine)
execute_pipe_body(memory, &pipe_info.body.clone(), pipe_info, self.into(), engine).await
} else {
Ok(result)
}
}
Function::InMemory => {
let mem = memory.clone();
let func = mem.get(&fn_name, self.into())?;
let result = func.call_fn(&fn_args, &mem, engine)?.ok_or_else(|| {
KclError::UndefinedValue(KclErrorDetails {
message: format!("Result of function {} is undefined", fn_name),
source_ranges: vec![self.into()],
})
})?;
let func = memory.get(&fn_name, self.into())?;
let result = func
.call_fn(fn_args, memory.clone(), engine.clone())
.await?
.ok_or_else(|| {
KclError::UndefinedValue(KclErrorDetails {
message: format!("Result of function {} is undefined", fn_name),
source_ranges: vec![self.into()],
})
})?;
let result = result.get_value()?;
@ -903,7 +908,7 @@ impl CallExpression {
pipe_info.index += 1;
pipe_info.previous_results.push(result);
execute_pipe_body(memory, &pipe_info.body.clone(), pipe_info, self.into(), engine)
execute_pipe_body(memory, &pipe_info.body.clone(), pipe_info, self.into(), engine).await
} else {
Ok(result)
}
@ -1424,11 +1429,12 @@ impl ArrayExpression {
None
}
pub fn execute(
#[async_recursion::async_recursion(?Send)]
pub async fn execute(
&self,
memory: &mut ProgramMemory,
pipe_info: &mut PipeInfo,
engine: &mut EngineConnection,
engine: &EngineConnection,
) -> Result<MemoryItem, KclError> {
let mut results = Vec::with_capacity(self.elements.len());
@ -1440,7 +1446,7 @@ impl ArrayExpression {
value.clone()
}
Value::BinaryExpression(binary_expression) => {
binary_expression.get_result(memory, pipe_info, engine)?
binary_expression.get_result(memory, pipe_info, engine).await?
}
Value::CallExpression(call_expression) => {
// We DO NOT set this globally because if we did and this was called inside a pipe it would
@ -1448,12 +1454,16 @@ impl ArrayExpression {
// THIS IS IMPORTANT.
let mut new_pipe_info = pipe_info.clone();
new_pipe_info.is_in_pipe = false;
call_expression.execute(memory, &mut new_pipe_info, engine)?
call_expression.execute(memory, &mut new_pipe_info, engine).await?
}
Value::UnaryExpression(unary_expression) => unary_expression.get_result(memory, pipe_info, engine)?,
Value::ObjectExpression(object_expression) => object_expression.execute(memory, pipe_info, engine)?,
Value::ArrayExpression(array_expression) => array_expression.execute(memory, pipe_info, engine)?,
Value::PipeExpression(pipe_expression) => pipe_expression.get_result(memory, pipe_info, engine)?,
Value::UnaryExpression(unary_expression) => {
unary_expression.get_result(memory, pipe_info, engine).await?
}
Value::ObjectExpression(object_expression) => {
object_expression.execute(memory, pipe_info, engine).await?
}
Value::ArrayExpression(array_expression) => array_expression.execute(memory, pipe_info, engine).await?,
Value::PipeExpression(pipe_expression) => pipe_expression.get_result(memory, pipe_info, engine).await?,
Value::PipeSubstitution(pipe_substitution) => {
return Err(KclError::Semantic(KclErrorDetails {
message: format!("PipeSubstitution not implemented here: {:?}", pipe_substitution),
@ -1569,11 +1579,12 @@ impl ObjectExpression {
None
}
pub fn execute(
#[async_recursion::async_recursion(?Send)]
pub async fn execute(
&self,
memory: &mut ProgramMemory,
pipe_info: &mut PipeInfo,
engine: &mut EngineConnection,
engine: &EngineConnection,
) -> Result<MemoryItem, KclError> {
let mut object = Map::new();
for property in &self.properties {
@ -1584,7 +1595,7 @@ impl ObjectExpression {
value.clone()
}
Value::BinaryExpression(binary_expression) => {
binary_expression.get_result(memory, pipe_info, engine)?
binary_expression.get_result(memory, pipe_info, engine).await?
}
Value::CallExpression(call_expression) => {
// We DO NOT set this globally because if we did and this was called inside a pipe it would
@ -1592,12 +1603,16 @@ impl ObjectExpression {
// THIS IS IMPORTANT.
let mut new_pipe_info = pipe_info.clone();
new_pipe_info.is_in_pipe = false;
call_expression.execute(memory, &mut new_pipe_info, engine)?
call_expression.execute(memory, &mut new_pipe_info, engine).await?
}
Value::UnaryExpression(unary_expression) => unary_expression.get_result(memory, pipe_info, engine)?,
Value::ObjectExpression(object_expression) => object_expression.execute(memory, pipe_info, engine)?,
Value::ArrayExpression(array_expression) => array_expression.execute(memory, pipe_info, engine)?,
Value::PipeExpression(pipe_expression) => pipe_expression.get_result(memory, pipe_info, engine)?,
Value::UnaryExpression(unary_expression) => {
unary_expression.get_result(memory, pipe_info, engine).await?
}
Value::ObjectExpression(object_expression) => {
object_expression.execute(memory, pipe_info, engine).await?
}
Value::ArrayExpression(array_expression) => array_expression.execute(memory, pipe_info, engine).await?,
Value::PipeExpression(pipe_expression) => pipe_expression.get_result(memory, pipe_info, engine).await?,
Value::PipeSubstitution(pipe_substitution) => {
return Err(KclError::Semantic(KclErrorDetails {
message: format!("PipeSubstitution not implemented here: {:?}", pipe_substitution),
@ -2005,11 +2020,12 @@ impl BinaryExpression {
None
}
pub fn get_result(
#[async_recursion::async_recursion(?Send)]
pub async fn get_result(
&self,
memory: &mut ProgramMemory,
pipe_info: &mut PipeInfo,
engine: &mut EngineConnection,
engine: &EngineConnection,
) -> Result<MemoryItem, KclError> {
// We DO NOT set this globally because if we did and this was called inside a pipe it would
// stop the execution of the pipe.
@ -2019,11 +2035,13 @@ impl BinaryExpression {
let left_json_value = self
.left
.get_result(memory, &mut new_pipe_info, engine)?
.get_result(memory, &mut new_pipe_info, engine)
.await?
.get_json_value()?;
let right_json_value = self
.right
.get_result(memory, &mut new_pipe_info, engine)?
.get_result(memory, &mut new_pipe_info, engine)
.await?
.get_json_value()?;
// First check if we are doing string concatenation.
@ -2173,11 +2191,11 @@ impl UnaryExpression {
format!("{}{}", &self.operator, self.argument.recast(options, 0))
}
pub fn get_result(
pub async fn get_result(
&self,
memory: &mut ProgramMemory,
pipe_info: &mut PipeInfo,
engine: &mut EngineConnection,
engine: &EngineConnection,
) -> Result<MemoryItem, KclError> {
// We DO NOT set this globally because if we did and this was called inside a pipe it would
// stop the execution of the pipe.
@ -2188,7 +2206,8 @@ impl UnaryExpression {
let num = parse_json_number_as_f64(
&self
.argument
.get_result(memory, &mut new_pipe_info, engine)?
.get_result(memory, &mut new_pipe_info, engine)
.await?
.get_json_value()?,
self.into(),
)?;
@ -2310,16 +2329,16 @@ impl PipeExpression {
None
}
pub fn get_result(
pub async fn get_result(
&self,
memory: &mut ProgramMemory,
pipe_info: &mut PipeInfo,
engine: &mut EngineConnection,
engine: &EngineConnection,
) -> Result<MemoryItem, KclError> {
// Reset the previous results.
pipe_info.previous_results = vec![];
pipe_info.index = 0;
execute_pipe_body(memory, &self.body, pipe_info, self.into(), engine)
execute_pipe_body(memory, &self.body, pipe_info, self.into(), engine).await
}
/// Rename all identifiers that have the old name to the new given name.
@ -2330,12 +2349,13 @@ impl PipeExpression {
}
}
fn execute_pipe_body(
#[async_recursion::async_recursion(?Send)]
async fn execute_pipe_body(
memory: &mut ProgramMemory,
body: &[Value],
pipe_info: &mut PipeInfo,
source_range: SourceRange,
engine: &mut EngineConnection,
engine: &EngineConnection,
) -> Result<MemoryItem, KclError> {
if pipe_info.index == body.len() {
pipe_info.is_in_pipe = false;
@ -2360,15 +2380,15 @@ fn execute_pipe_body(
match expression {
Value::BinaryExpression(binary_expression) => {
let result = binary_expression.get_result(memory, pipe_info, engine)?;
let result = binary_expression.get_result(memory, pipe_info, engine).await?;
pipe_info.previous_results.push(result);
pipe_info.index += 1;
execute_pipe_body(memory, body, pipe_info, source_range, engine)
execute_pipe_body(memory, body, pipe_info, source_range, engine).await
}
Value::CallExpression(call_expression) => {
pipe_info.is_in_pipe = true;
pipe_info.body = body.to_vec();
call_expression.execute(memory, pipe_info, engine)
call_expression.execute(memory, pipe_info, engine).await
}
_ => {
// Return an error this should not happen.
@ -2671,7 +2691,7 @@ fn ghi = (x) => {
}
show(part001)"#;
let tokens = crate::tokeniser::lexer(code);
let tokens = crate::token::lexer(code);
let parser = crate::parser::Parser::new(tokens);
let program = parser.ast().unwrap();
let symbols = program.get_lsp_symbols(code);
@ -2699,7 +2719,7 @@ show(part001)
let some_program_string = r#"const part001 = startSketchAt([0.0, 5.0])
|> line([0.4900857016, -0.0240763666], %)
|> line([0.6804562304, 0.9087880491], %)"#;
let tokens = crate::tokeniser::lexer(some_program_string);
let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens);
let program = parser.ast().unwrap();
@ -2718,7 +2738,7 @@ show(part001)
let some_program_string = r#"const part001 = startSketchAt([0.0, 5.0])
|> line([0.4900857016, -0.0240763666], %) // hello world
|> line([0.6804562304, 0.9087880491], %)"#;
let tokens = crate::tokeniser::lexer(some_program_string);
let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens);
let program = parser.ast().unwrap();
@ -2737,7 +2757,7 @@ show(part001)
|> line([0.4900857016, -0.0240763666], %)
// hello world
|> line([0.6804562304, 0.9087880491], %)"#;
let tokens = crate::tokeniser::lexer(some_program_string);
let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens);
let program = parser.ast().unwrap();
@ -2763,7 +2783,7 @@ show(part001)
// this is also a comment
return things
}"#;
let tokens = crate::tokeniser::lexer(some_program_string);
let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens);
let program = parser.ast().unwrap();
@ -2800,7 +2820,7 @@ const mySk1 = startSketchAt([0, 0])
|> ry(45, %)
|> rx(45, %)
// one more for good measure"#;
let tokens = crate::tokeniser::lexer(some_program_string);
let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens);
let program = parser.ast().unwrap();
@ -2839,7 +2859,7 @@ a comment between pipe expression statements */
|> line([-0.42, -1.72], %)
show(part001)"#;
let tokens = crate::tokeniser::lexer(some_program_string);
let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens);
let program = parser.ast().unwrap();
@ -2865,7 +2885,7 @@ const yo = [
" hey oooooo really long long long"
]
"#;
let tokens = crate::tokeniser::lexer(some_program_string);
let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens);
let program = parser.ast().unwrap();
@ -2883,7 +2903,7 @@ const key = 'c'
const things = "things"
// this is also a comment"#;
let tokens = crate::tokeniser::lexer(some_program_string);
let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens);
let program = parser.ast().unwrap();
@ -2901,7 +2921,7 @@ const things = "things"
// a comment
"
}"#;
let tokens = crate::tokeniser::lexer(some_program_string);
let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens);
let program = parser.ast().unwrap();
@ -2926,7 +2946,7 @@ const part001 = startSketchAt([0, 0])
-angleToMatchLengthY('seg01', myVar, %),
myVar
], %) // ln-lineTo-yAbsolute should use angleToMatchLengthY helper"#;
let tokens = crate::tokeniser::lexer(some_program_string);
let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens);
let program = parser.ast().unwrap();
@ -2952,7 +2972,7 @@ const part001 = startSketchAt([0, 0])
myVar
], %) // ln-lineTo-yAbsolute should use angleToMatchLengthY helper
"#;
let tokens = crate::tokeniser::lexer(some_program_string);
let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens);
let program = parser.ast().unwrap();
@ -2983,7 +3003,7 @@ fn ghi = (part001) => {
}
show(part001)"#;
let tokens = crate::tokeniser::lexer(some_program_string);
let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens);
let mut program = parser.ast().unwrap();
program.rename_symbol("mySuperCoolPart", 6);
@ -3014,7 +3034,7 @@ show(mySuperCoolPart)
let some_program_string = r#"fn ghi = (x, y, z) => {
return x
}"#;
let tokens = crate::tokeniser::lexer(some_program_string);
let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens);
let mut program = parser.ast().unwrap();
program.rename_symbol("newName", 10);
@ -3043,7 +3063,7 @@ const firstExtrude = startSketchAt([0,0])
|> extrude(h, %)
show(firstExtrude)"#;
let tokens = crate::tokeniser::lexer(some_program_string);
let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens);
let program = parser.ast().unwrap();
@ -3069,7 +3089,7 @@ show(firstExtrude)
#[tokio::test(flavor = "multi_thread")]
async fn test_recast_math_start_negative() {
let some_program_string = r#"const myVar = -5 + 6"#;
let tokens = crate::tokeniser::lexer(some_program_string);
let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens);
let program = parser.ast().unwrap();
@ -3085,7 +3105,7 @@ const FOS = 2
const sigmaAllow = 8
const width = 20
const thickness = sqrt(distance * p * FOS * 6 / (sigmaAllow * width))"#;
let tokens = crate::tokeniser::lexer(some_program_string);
let tokens = crate::token::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens);
let program = parser.ast().unwrap();

View File

@ -3,10 +3,11 @@
use std::sync::Arc;
use anyhow::Result;
use anyhow::{anyhow, Result};
use dashmap::DashMap;
use futures::{SinkExt, StreamExt};
use kittycad::types::{OkWebSocketResponseData, WebSocketRequest, WebSocketResponse};
use tokio::sync::{mpsc, oneshot};
use tokio_tungstenite::tungstenite::Message as WsMsg;
use crate::{
@ -14,18 +15,13 @@ use crate::{
errors::{KclError, KclErrorDetails},
};
#[derive(Debug)]
type WebSocketTcpWrite = futures::stream::SplitSink<tokio_tungstenite::WebSocketStream<reqwest::Upgraded>, WsMsg>;
#[derive(Debug, Clone)]
#[allow(dead_code)] // for the TcpReadHandle
pub struct EngineConnection {
tcp_write: futures::stream::SplitSink<tokio_tungstenite::WebSocketStream<reqwest::Upgraded>, WsMsg>,
tcp_read_handle: tokio::task::JoinHandle<Result<()>>,
engine_req_tx: mpsc::Sender<ToEngineReq>,
responses: Arc<DashMap<uuid::Uuid, WebSocketResponse>>,
}
impl Drop for EngineConnection {
fn drop(&mut self) {
// Drop the read handle.
self.tcp_read_handle.abort();
}
tcp_read_handle: Arc<TcpReadHandle>,
}
pub struct TcpRead {
@ -46,16 +42,63 @@ impl TcpRead {
}
}
#[derive(Debug)]
pub struct TcpReadHandle {
handle: Arc<tokio::task::JoinHandle<Result<()>>>,
}
impl Drop for TcpReadHandle {
fn drop(&mut self) {
// Drop the read handle.
self.handle.abort();
}
}
/// Requests to send to the engine, and a way to await a response.
struct ToEngineReq {
/// The request to send
req: WebSocketRequest,
/// If this resolves to Ok, the request was sent.
/// If this resolves to Err, the request could not be sent.
/// If this has not yet resolved, the request has not been sent yet.
request_sent: oneshot::Sender<Result<()>>,
}
impl EngineConnection {
/// Start waiting for incoming engine requests, and send each one over the WebSocket to the engine.
async fn start_write_actor(mut tcp_write: WebSocketTcpWrite, mut engine_req_rx: mpsc::Receiver<ToEngineReq>) {
while let Some(req) = engine_req_rx.recv().await {
let ToEngineReq { req, request_sent } = req;
let res = Self::inner_send_to_engine(req, &mut tcp_write).await;
let _ = request_sent.send(res);
}
}
/// Send the given `request` to the engine via the WebSocket connection `tcp_write`.
async fn inner_send_to_engine(request: WebSocketRequest, tcp_write: &mut WebSocketTcpWrite) -> Result<()> {
let msg = serde_json::to_string(&request).map_err(|e| anyhow!("could not serialize json: {e}"))?;
tcp_write
.send(WsMsg::Text(msg))
.await
.map_err(|e| anyhow!("could not send json over websocket: {e}"))?;
Ok(())
}
pub async fn new(ws: reqwest::Upgraded) -> Result<EngineConnection> {
let ws_stream = tokio_tungstenite::WebSocketStream::from_raw_socket(
ws,
tokio_tungstenite::tungstenite::protocol::Role::Client,
None,
Some(tokio_tungstenite::tungstenite::protocol::WebSocketConfig {
write_buffer_size: 1024 * 128,
max_write_buffer_size: 1024 * 256,
..Default::default()
}),
)
.await;
let (tcp_write, tcp_read) = ws_stream.split();
let (engine_req_tx, engine_req_rx) = mpsc::channel(10);
tokio::task::spawn(Self::start_write_actor(tcp_write, engine_req_rx));
let mut tcp_read = TcpRead { stream: tcp_read };
@ -80,42 +123,34 @@ impl EngineConnection {
});
Ok(EngineConnection {
tcp_write,
tcp_read_handle,
engine_req_tx,
tcp_read_handle: Arc::new(TcpReadHandle {
handle: Arc::new(tcp_read_handle),
}),
responses,
})
}
pub async fn tcp_send(&mut self, msg: WebSocketRequest) -> Result<()> {
let msg = serde_json::to_string(&msg)?;
self.tcp_write.send(WsMsg::Text(msg)).await?;
Ok(())
}
}
#[async_trait::async_trait(?Send)]
impl EngineManager for EngineConnection {
/// Send a modeling command.
/// Do not wait for the response message.
fn send_modeling_cmd(
&mut self,
id: uuid::Uuid,
source_range: crate::executor::SourceRange,
cmd: kittycad::types::ModelingCmd,
) -> Result<(), KclError> {
futures::executor::block_on(self.send_modeling_cmd_get_response(id, source_range, cmd))?;
Ok(())
}
/// Send a modeling command and wait for the response message.
async fn send_modeling_cmd_get_response(
&mut self,
async fn send_modeling_cmd(
&self,
id: uuid::Uuid,
source_range: crate::executor::SourceRange,
cmd: kittycad::types::ModelingCmd,
) -> Result<OkWebSocketResponseData, KclError> {
self.tcp_send(WebSocketRequest::ModelingCmdReq { cmd, cmd_id: id })
let (tx, rx) = oneshot::channel();
// Send the request to the engine, via the actor.
self.engine_req_tx
.send(ToEngineReq {
req: WebSocketRequest::ModelingCmdReq {
cmd: cmd.clone(),
cmd_id: id,
},
request_sent: tx,
})
.await
.map_err(|e| {
KclError::Engine(KclErrorDetails {
@ -124,18 +159,40 @@ impl EngineManager for EngineConnection {
})
})?;
// Wait for the request to be sent.
rx.await
.map_err(|e| {
KclError::Engine(KclErrorDetails {
message: format!("could not send request to the engine actor: {e}"),
source_ranges: vec![source_range],
})
})?
.map_err(|e| {
KclError::Engine(KclErrorDetails {
message: format!("could not send request to the engine: {e}"),
source_ranges: vec![source_range],
})
})?;
// Wait for the response.
loop {
if let Some(resp) = self.responses.get(&id) {
if let Some(data) = &resp.resp {
return Ok(data.clone());
let current_time = std::time::Instant::now();
while current_time.elapsed().as_secs() < 60 {
// We pop off the responses to cleanup our mappings.
if let Some((_, resp)) = self.responses.remove(&id) {
return if let Some(data) = &resp.resp {
Ok(data.clone())
} else {
return Err(KclError::Engine(KclErrorDetails {
Err(KclError::Engine(KclErrorDetails {
message: format!("Modeling command failed: {:?}", resp.errors),
source_ranges: vec![source_range],
}));
}
}))
};
}
}
Err(KclError::Engine(KclErrorDetails {
message: format!("Modeling command timed out `{}`: {:?}", id, cmd),
source_ranges: vec![source_range],
}))
}
}
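The refactor above replaces direct `tcp_write.send(..)` calls with a single write actor: callers queue a `ToEngineReq` on an mpsc channel, the actor owns the WebSocket sink and drains the queue in order, and a oneshot channel reports back whether each request was actually written; responses are then pulled out of the shared `responses` map under a 60-second deadline. A minimal, self-contained sketch of that queue-plus-acknowledgement pattern (the `Req`/`run_writer` names and the pretend "write" are illustrative, not the repo's API):

use anyhow::{anyhow, Result};
use tokio::sync::{mpsc, oneshot};

// One queued request plus a channel for reporting whether it was written out.
struct Req {
    payload: String,
    sent: oneshot::Sender<Result<()>>,
}

// The single writer task: it owns the sink and drains the queue in order.
async fn run_writer(mut rx: mpsc::Receiver<Req>) {
    while let Some(Req { payload, sent }) = rx.recv().await {
        // The real actor serializes the request and writes it to the WebSocket;
        // this stand-in only pretends to, and always reports success.
        println!("pretend we wrote {payload} to the websocket");
        let _ = sent.send(Ok(()));
    }
}

#[tokio::main(flavor = "current_thread")]
async fn main() -> Result<()> {
    let (tx, rx) = mpsc::channel(10);
    tokio::spawn(run_writer(rx));

    let (sent_tx, sent_rx) = oneshot::channel();
    tx.send(Req { payload: "ModelingCmdReq".into(), sent: sent_tx })
        .await
        .map_err(|_| anyhow!("writer task is gone"))?;
    // First `?`: the actor dropped the ack channel; second `?`: the write itself failed.
    sent_rx.await??;
    Ok(())
}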

View File

@ -6,7 +6,7 @@ use kittycad::types::OkWebSocketResponseData;
use crate::errors::KclError;
#[derive(Debug)]
#[derive(Debug, Clone)]
pub struct EngineConnection {}
impl EngineConnection {
@ -17,21 +17,14 @@ impl EngineConnection {
#[async_trait::async_trait(?Send)]
impl crate::engine::EngineManager for EngineConnection {
fn send_modeling_cmd(
&mut self,
_id: uuid::Uuid,
_source_range: crate::executor::SourceRange,
_cmd: kittycad::types::ModelingCmd,
) -> Result<(), KclError> {
Ok(())
}
async fn send_modeling_cmd_get_response(
&mut self,
async fn send_modeling_cmd(
&self,
_id: uuid::Uuid,
_source_range: crate::executor::SourceRange,
_cmd: kittycad::types::ModelingCmd,
) -> Result<OkWebSocketResponseData, KclError> {
todo!()
Ok(OkWebSocketResponseData::Modeling {
modeling_response: kittycad::types::OkModelingCmdResponse::Empty {},
})
}
}
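With the trait now exposing a single `async fn send_modeling_cmd(&self, ..)`, the mock simply answers with an empty modeling response. A small sketch of defining and mocking such a trait with `async_trait`'s `?Send` mode (the `Engine`/`MockEngine` names and the `Response` type are hypothetical stand-ins, not the repo's types):

use async_trait::async_trait;

#[derive(Debug, Clone, PartialEq)]
struct Response(String);

// `?Send` mirrors the real trait: the generated boxed futures need not be Send
// (presumably so the wasm-side engine, which holds JS handles, can implement it too).
#[async_trait(?Send)]
trait Engine: Clone {
    async fn send_modeling_cmd(&self, cmd: &str) -> Result<Response, String>;
}

#[derive(Debug, Clone)]
struct MockEngine;

#[async_trait(?Send)]
impl Engine for MockEngine {
    async fn send_modeling_cmd(&self, _cmd: &str) -> Result<Response, String> {
        // The real mock answers with OkWebSocketResponseData::Modeling { Empty };
        // this stand-in just returns a canned value.
        Ok(Response("empty".to_string()))
    }
}

fn main() {
    let engine = MockEngine;
    let resp = futures::executor::block_on(engine.send_modeling_cmd("extrude")).unwrap();
    assert_eq!(resp, Response("empty".to_string()));
}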

View File

@ -1,5 +1,6 @@
//! Functions for setting up our WebSocket and WebRTC connections for communications with the
//! engine.
use std::sync::Arc;
use anyhow::Result;
use kittycad::types::WebSocketRequest;
@ -23,44 +24,21 @@ extern "C" {
#[derive(Debug, Clone)]
pub struct EngineConnection {
manager: EngineCommandManager,
manager: Arc<EngineCommandManager>,
}
impl EngineConnection {
pub async fn new(manager: EngineCommandManager) -> Result<EngineConnection, JsValue> {
Ok(EngineConnection { manager })
Ok(EngineConnection {
manager: Arc::new(manager),
})
}
}
#[async_trait::async_trait(?Send)]
impl crate::engine::EngineManager for EngineConnection {
fn send_modeling_cmd(
&mut self,
id: uuid::Uuid,
source_range: crate::executor::SourceRange,
cmd: kittycad::types::ModelingCmd,
) -> Result<(), KclError> {
let source_range_str = serde_json::to_string(&source_range).map_err(|e| {
KclError::Engine(KclErrorDetails {
message: format!("Failed to serialize source range: {:?}", e),
source_ranges: vec![source_range],
})
})?;
let ws_msg = WebSocketRequest::ModelingCmdReq { cmd, cmd_id: id };
let cmd_str = serde_json::to_string(&ws_msg).map_err(|e| {
KclError::Engine(KclErrorDetails {
message: format!("Failed to serialize modeling command: {:?}", e),
source_ranges: vec![source_range],
})
})?;
let _ = self
.manager
.sendModelingCommandFromWasm(id.to_string(), source_range_str, cmd_str);
Ok(())
}
async fn send_modeling_cmd_get_response(
&mut self,
async fn send_modeling_cmd(
&self,
id: uuid::Uuid,
source_range: crate::executor::SourceRange,
cmd: kittycad::types::ModelingCmd,

View File

@ -32,19 +32,10 @@ use anyhow::Result;
pub use conn_mock::EngineConnection;
#[async_trait::async_trait(?Send)]
pub trait EngineManager {
/// Send a modeling command.
/// Do not wait for the response message.
fn send_modeling_cmd(
&mut self,
id: uuid::Uuid,
source_range: crate::executor::SourceRange,
cmd: kittycad::types::ModelingCmd,
) -> Result<(), crate::errors::KclError>;
pub trait EngineManager: Clone {
/// Send a modeling command and wait for the response message.
async fn send_modeling_cmd_get_response(
&mut self,
async fn send_modeling_cmd(
&self,
id: uuid::Uuid,
source_range: crate::executor::SourceRange,
cmd: kittycad::types::ModelingCmd,

View File

@ -104,7 +104,7 @@ pub enum MemoryItem {
SketchGroup(Box<SketchGroup>),
ExtrudeGroup(Box<ExtrudeGroup>),
#[ts(skip)]
ExtrudeTransform(ExtrudeTransform),
ExtrudeTransform(Box<ExtrudeTransform>),
#[ts(skip)]
Function {
#[serde(skip)]
@ -134,13 +134,28 @@ pub struct ExtrudeTransform {
pub meta: Vec<Metadata>,
}
pub type MemoryFunction = fn(
s: &[MemoryItem],
memory: &ProgramMemory,
expression: &FunctionExpression,
metadata: &[Metadata],
engine: &mut EngineConnection,
) -> Result<Option<ProgramReturn>, KclError>;
pub type MemoryFunction =
fn(
s: Vec<MemoryItem>,
memory: ProgramMemory,
expression: Box<FunctionExpression>,
metadata: Vec<Metadata>,
engine: EngineConnection,
) -> std::pin::Pin<Box<dyn std::future::Future<Output = Result<Option<ProgramReturn>, KclError>>>>;
fn force_memory_function<
F: Fn(
Vec<MemoryItem>,
ProgramMemory,
Box<FunctionExpression>,
Vec<Metadata>,
EngineConnection,
) -> std::pin::Pin<Box<dyn std::future::Future<Output = Result<Option<ProgramReturn>, KclError>>>>,
>(
f: F,
) -> F {
f
}
impl From<MemoryItem> for Vec<SourceRange> {
fn from(item: MemoryItem) -> Self {
@ -168,24 +183,24 @@ impl MemoryItem {
}
}
pub fn call_fn(
pub async fn call_fn(
&self,
args: &[MemoryItem],
memory: &ProgramMemory,
engine: &mut EngineConnection,
args: Vec<MemoryItem>,
memory: ProgramMemory,
engine: EngineConnection,
) -> Result<Option<ProgramReturn>, KclError> {
if let MemoryItem::Function { func, expression, meta } = self {
if let MemoryItem::Function { func, expression, meta } = &self {
if let Some(func) = func {
func(args, memory, expression, meta, engine)
func(args, memory, expression.clone(), meta.clone(), engine).await
} else {
Err(KclError::Semantic(KclErrorDetails {
message: format!("Not a function: {:?}", self),
message: format!("Not a function: {:?}", expression),
source_ranges: vec![],
}))
}
} else {
Err(KclError::Semantic(KclErrorDetails {
message: format!("not a function: {:?}", self),
message: "not a in memory function".to_string(),
source_ranges: vec![],
}))
}
@ -579,11 +594,11 @@ impl Default for PipeInfo {
}
/// Execute a AST's program.
pub fn execute(
pub async fn execute(
program: crate::ast::types::Program,
memory: &mut ProgramMemory,
options: BodyType,
engine: &mut EngineConnection,
engine: &EngineConnection,
) -> Result<ProgramMemory, KclError> {
let mut pipe_info = PipeInfo::default();
@ -602,7 +617,23 @@ pub fn execute(
args.push(memory_item.clone());
}
Value::CallExpression(call_expr) => {
let result = call_expr.execute(memory, &mut pipe_info, engine)?;
let result = call_expr.execute(memory, &mut pipe_info, engine).await?;
args.push(result);
}
Value::BinaryExpression(binary_expression) => {
let result = binary_expression.get_result(memory, &mut pipe_info, engine).await?;
args.push(result);
}
Value::UnaryExpression(unary_expression) => {
let result = unary_expression.get_result(memory, &mut pipe_info, engine).await?;
args.push(result);
}
Value::ObjectExpression(object_expression) => {
let result = object_expression.execute(memory, &mut pipe_info, engine).await?;
args.push(result);
}
Value::ArrayExpression(array_expression) => {
let result = array_expression.execute(memory, &mut pipe_info, engine).await?;
args.push(result);
}
// We do nothing for the rest.
@ -620,7 +651,7 @@ pub fn execute(
memory.return_ = Some(ProgramReturn::Arguments(call_expr.arguments.clone()));
} else if let Some(func) = memory.clone().root.get(&fn_name) {
let result = func.call_fn(&args, memory, engine)?;
let result = func.call_fn(args.clone(), memory.clone(), engine.clone()).await?;
memory.return_ = result;
} else {
@ -646,22 +677,27 @@ pub fn execute(
memory.add(&var_name, value.clone(), source_range)?;
}
Value::BinaryExpression(binary_expression) => {
let result = binary_expression.get_result(memory, &mut pipe_info, engine)?;
let result = binary_expression.get_result(memory, &mut pipe_info, engine).await?;
memory.add(&var_name, result, source_range)?;
}
Value::FunctionExpression(function_expression) => {
memory.add(
&var_name,
MemoryItem::Function{
expression: function_expression.clone(),
meta: vec![metadata],
func: Some(|args: &[MemoryItem], memory: &ProgramMemory, function_expression: &FunctionExpression, _metadata: &[Metadata], engine: &mut EngineConnection| -> Result<Option<ProgramReturn>, KclError> {
let mem_func = force_memory_function(
|args: Vec<MemoryItem>,
memory: ProgramMemory,
function_expression: Box<FunctionExpression>,
_metadata: Vec<Metadata>,
engine: EngineConnection| {
Box::pin(async move {
let mut fn_memory = memory.clone();
if args.len() != function_expression.params.len() {
return Err(KclError::Semantic(KclErrorDetails {
message: format!("Expected {} arguments, got {}", function_expression.params.len(), args.len()),
source_ranges: vec![function_expression.into()],
message: format!(
"Expected {} arguments, got {}",
function_expression.params.len(),
args.len(),
),
source_ranges: vec![(&function_expression).into()],
}));
}
@ -674,20 +710,34 @@ pub fn execute(
)?;
}
let result = execute(function_expression.body.clone(), &mut fn_memory, BodyType::Block, engine)?;
let result = execute(
function_expression.body.clone(),
&mut fn_memory,
BodyType::Block,
&engine,
)
.await?;
Ok(result.return_)
})
},
);
memory.add(
&var_name,
MemoryItem::Function {
expression: function_expression.clone(),
meta: vec![metadata],
func: Some(mem_func),
},
source_range,
)?;
}
Value::CallExpression(call_expression) => {
let result = call_expression.execute(memory, &mut pipe_info, engine)?;
let result = call_expression.execute(memory, &mut pipe_info, engine).await?;
memory.add(&var_name, result, source_range)?;
}
Value::PipeExpression(pipe_expression) => {
let result = pipe_expression.get_result(memory, &mut pipe_info, engine)?;
let result = pipe_expression.get_result(memory, &mut pipe_info, engine).await?;
memory.add(&var_name, result, source_range)?;
}
Value::PipeSubstitution(pipe_substitution) => {
@ -700,11 +750,11 @@ pub fn execute(
}));
}
Value::ArrayExpression(array_expression) => {
let result = array_expression.execute(memory, &mut pipe_info, engine)?;
let result = array_expression.execute(memory, &mut pipe_info, engine).await?;
memory.add(&var_name, result, source_range)?;
}
Value::ObjectExpression(object_expression) => {
let result = object_expression.execute(memory, &mut pipe_info, engine)?;
let result = object_expression.execute(memory, &mut pipe_info, engine).await?;
memory.add(&var_name, result, source_range)?;
}
Value::MemberExpression(member_expression) => {
@ -712,7 +762,7 @@ pub fn execute(
memory.add(&var_name, result, source_range)?;
}
Value::UnaryExpression(unary_expression) => {
let result = unary_expression.get_result(memory, &mut pipe_info, engine)?;
let result = unary_expression.get_result(memory, &mut pipe_info, engine).await?;
memory.add(&var_name, result, source_range)?;
}
}
@ -720,11 +770,11 @@ pub fn execute(
}
BodyItem::ReturnStatement(return_statement) => match &return_statement.argument {
Value::BinaryExpression(bin_expr) => {
let result = bin_expr.get_result(memory, &mut pipe_info, engine)?;
let result = bin_expr.get_result(memory, &mut pipe_info, engine).await?;
memory.return_ = Some(ProgramReturn::Value(result));
}
Value::UnaryExpression(unary_expr) => {
let result = unary_expr.get_result(memory, &mut pipe_info, engine)?;
let result = unary_expr.get_result(memory, &mut pipe_info, engine).await?;
memory.return_ = Some(ProgramReturn::Value(result));
}
Value::Identifier(identifier) => {
@ -735,15 +785,15 @@ pub fn execute(
memory.return_ = Some(ProgramReturn::Value(literal.into()));
}
Value::ArrayExpression(array_expr) => {
let result = array_expr.execute(memory, &mut pipe_info, engine)?;
let result = array_expr.execute(memory, &mut pipe_info, engine).await?;
memory.return_ = Some(ProgramReturn::Value(result));
}
Value::ObjectExpression(obj_expr) => {
let result = obj_expr.execute(memory, &mut pipe_info, engine)?;
let result = obj_expr.execute(memory, &mut pipe_info, engine).await?;
memory.return_ = Some(ProgramReturn::Value(result));
}
Value::CallExpression(call_expr) => {
let result = call_expr.execute(memory, &mut pipe_info, engine)?;
let result = call_expr.execute(memory, &mut pipe_info, engine).await?;
memory.return_ = Some(ProgramReturn::Value(result));
}
Value::MemberExpression(member_expr) => {
@ -751,7 +801,7 @@ pub fn execute(
memory.return_ = Some(ProgramReturn::Value(result));
}
Value::PipeExpression(pipe_expr) => {
let result = pipe_expr.get_result(memory, &mut pipe_info, engine)?;
let result = pipe_expr.get_result(memory, &mut pipe_info, engine).await?;
memory.return_ = Some(ProgramReturn::Value(result));
}
Value::PipeSubstitution(_) => {}
@ -770,12 +820,12 @@ mod tests {
use super::*;
pub async fn parse_execute(code: &str) -> Result<ProgramMemory> {
let tokens = crate::tokeniser::lexer(code);
let tokens = crate::token::lexer(code);
let parser = crate::parser::Parser::new(tokens);
let program = parser.ast()?;
let mut mem: ProgramMemory = Default::default();
let mut engine = EngineConnection::new().await?;
let memory = execute(program, &mut mem, BodyType::Root, &mut engine)?;
let engine = EngineConnection::new().await?;
let memory = execute(program, &mut mem, BodyType::Root, &engine).await?;
Ok(memory)
}
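`MemoryFunction` above changes from a synchronous `fn` pointer into a `fn` pointer returning a pinned, boxed future, and `force_memory_function` is an identity helper whose only job is to fix the closure's return type so that the `Box::pin(async move { .. })` inside it coerces to the trait object, and the non-capturing closure itself coerces to the `fn` pointer. A stripped-down sketch of the same trick with placeholder types (not the executor's real signatures):

use std::future::Future;
use std::pin::Pin;

type FnOutput = Result<i64, String>;
type BoxedFut = Pin<Box<dyn Future<Output = FnOutput>>>;

// Stored functions stay plain `fn` pointers (Copy, no captures); only their
// bodies run asynchronously, via the boxed future they return.
type MemFn = fn(i64) -> BoxedFut;

// Identity helper, like `force_memory_function`: the bound tells inference that
// the closure returns `BoxedFut`, so the coercion to `dyn Future` happens inside it.
fn force_mem_fn<F: Fn(i64) -> BoxedFut>(f: F) -> F {
    f
}

fn main() {
    // Non-capturing closure, so it also coerces to the `fn` pointer type `MemFn`.
    let double: MemFn = force_mem_fn(|x| Box::pin(async move { Ok::<_, String>(x * 2) }));
    let result = futures::executor::block_on(double(21));
    assert_eq!(result, Ok(42));
}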

View File

@ -9,4 +9,4 @@ pub mod math_parser;
pub mod parser;
pub mod server;
pub mod std;
pub mod tokeniser;
pub mod token;

View File

@ -10,8 +10,8 @@ use crate::{
},
errors::{KclError, KclErrorDetails},
executor::SourceRange,
parser::{is_not_code_token, Parser},
tokeniser::{Token, TokenType},
parser::Parser,
token::{Token, TokenType},
};
#[derive(Debug, PartialEq, Eq, Deserialize, Serialize, Clone, ts_rs::TS)]
@ -334,7 +334,7 @@ impl ReversePolishNotation {
return rpn.parse();
}
if is_not_code_token(current_token) {
if !current_token.is_code_token() {
let rpn = ReversePolishNotation::new(&self.parser.tokens[1..], &self.previous_postfix, &self.operators);
return rpn.parse();
}
@ -704,7 +704,7 @@ mod test {
#[test]
fn test_parse_expression() {
let tokens = crate::tokeniser::lexer("1 + 2");
let tokens = crate::token::lexer("1 + 2");
let mut parser = MathParser::new(&tokens);
let result = parser.parse().unwrap();
assert_eq!(
@ -731,7 +731,7 @@ mod test {
#[test]
fn test_parse_expression_add_no_spaces() {
let tokens = crate::tokeniser::lexer("1+2");
let tokens = crate::token::lexer("1+2");
let mut parser = MathParser::new(&tokens);
let result = parser.parse().unwrap();
assert_eq!(
@ -758,7 +758,7 @@ mod test {
#[test]
fn test_parse_expression_sub_no_spaces() {
let tokens = crate::tokeniser::lexer("1 -2");
let tokens = crate::token::lexer("1 -2");
let mut parser = MathParser::new(&tokens);
let result = parser.parse().unwrap();
assert_eq!(
@ -785,7 +785,7 @@ mod test {
#[test]
fn test_parse_expression_plus_followed_by_star() {
let tokens = crate::tokeniser::lexer("1 + 2 * 3");
let tokens = crate::token::lexer("1 + 2 * 3");
let mut parser = MathParser::new(&tokens);
let result = parser.parse().unwrap();
assert_eq!(
@ -823,7 +823,7 @@ mod test {
#[test]
fn test_parse_expression_with_parentheses() {
let tokens = crate::tokeniser::lexer("1 * ( 2 + 3 )");
let tokens = crate::token::lexer("1 * ( 2 + 3 )");
let mut parser = MathParser::new(&tokens);
let result = parser.parse().unwrap();
assert_eq!(
@ -861,7 +861,7 @@ mod test {
#[test]
fn test_parse_expression_parens_in_middle() {
let tokens = crate::tokeniser::lexer("1 * ( 2 + 3 ) / 4");
let tokens = crate::token::lexer("1 * ( 2 + 3 ) / 4");
let mut parser = MathParser::new(&tokens);
let result = parser.parse().unwrap();
assert_eq!(
@ -910,7 +910,7 @@ mod test {
#[test]
fn test_parse_expression_parans_and_predence() {
let tokens = crate::tokeniser::lexer("1 + ( 2 + 3 ) / 4");
let tokens = crate::token::lexer("1 + ( 2 + 3 ) / 4");
let mut parser = MathParser::new(&tokens);
let result = parser.parse().unwrap();
assert_eq!(
@ -958,7 +958,7 @@ mod test {
}
#[test]
fn test_parse_expression_nested() {
let tokens = crate::tokeniser::lexer("1 * (( 2 + 3 ) / 4 + 5 )");
let tokens = crate::token::lexer("1 * (( 2 + 3 ) / 4 + 5 )");
let mut parser = MathParser::new(&tokens);
let result = parser.parse().unwrap();
assert_eq!(
@ -1017,7 +1017,7 @@ mod test {
}
#[test]
fn test_parse_expression_redundant_braces() {
let tokens = crate::tokeniser::lexer("1 * ((( 2 + 3 )))");
let tokens = crate::token::lexer("1 * ((( 2 + 3 )))");
let mut parser = MathParser::new(&tokens);
let result = parser.parse().unwrap();
assert_eq!(
@ -1055,7 +1055,7 @@ mod test {
#[test]
fn test_reverse_polish_notation_simple() {
let parser = ReversePolishNotation::new(&crate::tokeniser::lexer("1 + 2"), &[], &[]);
let parser = ReversePolishNotation::new(&crate::token::lexer("1 + 2"), &[], &[]);
let result = parser.parse().unwrap();
assert_eq!(
result,
@ -1084,7 +1084,7 @@ mod test {
#[test]
fn test_reverse_polish_notation_complex() {
let parser = ReversePolishNotation::new(&crate::tokeniser::lexer("1 + 2 * 3"), &[], &[]);
let parser = ReversePolishNotation::new(&crate::token::lexer("1 + 2 * 3"), &[], &[]);
let result = parser.parse().unwrap();
assert_eq!(
result,
@ -1125,7 +1125,7 @@ mod test {
#[test]
fn test_reverse_polish_notation_complex_with_parentheses() {
let parser = ReversePolishNotation::new(&crate::tokeniser::lexer("1 * ( 2 + 3 )"), &[], &[]);
let parser = ReversePolishNotation::new(&crate::token::lexer("1 * ( 2 + 3 )"), &[], &[]);
let result = parser.parse().unwrap();
assert_eq!(
result,
@ -1179,7 +1179,7 @@ mod test {
#[test]
fn test_parse_expression_redundant_braces_around_literal() {
let code = "2 + (((3)))";
let tokens = crate::tokeniser::lexer(code);
let tokens = crate::token::lexer(code);
let mut parser = MathParser::new(&tokens);
let result = parser.parse().unwrap();
assert_eq!(
@ -1274,7 +1274,7 @@ mod test {
#[test]
fn test_parse_expression_braces_around_lots_of_math() {
let code = "(distance * p * FOS * 6 / (sigmaAllow * width))";
let tokens = crate::tokeniser::lexer(code);
let tokens = crate::token::lexer(code);
let mut parser = MathParser::new(&tokens);
let result = parser.parse();
assert!(result.is_ok());
@ -1283,7 +1283,7 @@ mod test {
#[test]
fn test_parse_expression_braces_around_internals_lots_of_math() {
let code = "distance * p * FOS * 6 / (sigmaAllow * width)";
let tokens = crate::tokeniser::lexer(code);
let tokens = crate::token::lexer(code);
let mut parser = MathParser::new(&tokens);
let result = parser.parse();
assert!(result.is_ok());

View File

@ -10,7 +10,7 @@ use crate::{
},
errors::{KclError, KclErrorDetails},
math_parser::MathParser,
tokeniser::{Token, TokenType},
token::{Token, TokenType},
};
pub const PIPE_SUBSTITUTION_OPERATOR: &str = "%";
@ -249,7 +249,7 @@ impl Parser {
}
let current_token = self.get_token(index)?;
if is_not_code_token(current_token) {
if !current_token.is_code_token() {
return self.find_end_of_non_code_node(index + 1);
}
@ -262,7 +262,7 @@ impl Parser {
}
let current_token = self.get_token(index)?;
if is_not_code_token(current_token) {
if !current_token.is_code_token() {
return self.find_start_of_non_code_node(index - 1);
}
@ -365,7 +365,7 @@ impl Parser {
});
};
if is_not_code_token(token) {
if !token.is_code_token() {
let non_code_node = self.make_non_code_node(new_index)?;
let new_new_index = non_code_node.1 + 1;
let bonus_non_code_node = non_code_node.0;
@ -1623,7 +1623,7 @@ impl Parser {
});
}
if is_not_code_token(token) {
if !token.is_code_token() {
let next_token = self.next_meaningful_token(token_index, Some(0))?;
if let Some(node) = &next_token.non_code_node {
if previous_body.is_empty() {
@ -1788,12 +1788,6 @@ impl Parser {
}
}
pub fn is_not_code_token(token: &Token) -> bool {
token.token_type == TokenType::Whitespace
|| token.token_type == TokenType::LineComment
|| token.token_type == TokenType::BlockComment
}
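The helper above goes away in favor of a method on `Token` in the new `token` module, which is outside this hunk; judging by the removed body and the updated tests below, that method presumably looks like this stand-alone sketch:

// Stand-in types mirroring the ones in the `token` module, for illustration only.
#[allow(dead_code)]
enum TokenType {
    Word,
    Whitespace,
    LineComment,
    BlockComment,
}

struct Token {
    token_type: TokenType,
}

impl Token {
    /// Whitespace and comments do not contribute to program structure, so the
    /// parser skips them; every other kind of token is a "code" token.
    fn is_code_token(&self) -> bool {
        !matches!(
            self.token_type,
            TokenType::Whitespace | TokenType::LineComment | TokenType::BlockComment
        )
    }
}

fn main() {
    assert!(Token { token_type: TokenType::Word }.is_code_token());
    assert!(!Token { token_type: TokenType::Whitespace }.is_code_token());
}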
#[cfg(test)]
mod tests {
use pretty_assertions::assert_eq;
@ -1803,7 +1797,7 @@ mod tests {
#[test]
fn test_make_identifier() {
let tokens = crate::tokeniser::lexer("a");
let tokens = crate::token::lexer("a");
let parser = Parser::new(tokens);
let identifier = parser.make_identifier(0).unwrap();
assert_eq!(
@ -1818,7 +1812,7 @@ mod tests {
#[test]
fn test_make_identifier_with_const_myvar_equals_5_and_index_2() {
let tokens = crate::tokeniser::lexer("const myVar = 5");
let tokens = crate::token::lexer("const myVar = 5");
let parser = Parser::new(tokens);
let identifier = parser.make_identifier(2).unwrap();
assert_eq!(
@ -1833,7 +1827,7 @@ mod tests {
#[test]
fn test_make_identifier_multiline() {
let tokens = crate::tokeniser::lexer("const myVar = 5\nconst newVar = myVar + 1");
let tokens = crate::token::lexer("const myVar = 5\nconst newVar = myVar + 1");
let parser = Parser::new(tokens);
let identifier = parser.make_identifier(2).unwrap();
assert_eq!(
@ -1857,7 +1851,7 @@ mod tests {
#[test]
fn test_make_identifier_call_expression() {
let tokens = crate::tokeniser::lexer("log(5, \"hello\", aIdentifier)");
let tokens = crate::token::lexer("log(5, \"hello\", aIdentifier)");
let parser = Parser::new(tokens);
let identifier = parser.make_identifier(0).unwrap();
assert_eq!(
@ -1880,7 +1874,7 @@ mod tests {
}
#[test]
fn test_make_non_code_node() {
let tokens = crate::tokeniser::lexer("log(5, \"hello\", aIdentifier)");
let tokens = crate::token::lexer("log(5, \"hello\", aIdentifier)");
let parser = Parser::new(tokens);
let index = 4;
let expected_output = (None, 4);
@ -1889,7 +1883,7 @@ mod tests {
let index = 7;
let expected_output = (None, 7);
assert_eq!(parser.make_non_code_node(index).unwrap(), expected_output);
let tokens = crate::tokeniser::lexer(
let tokens = crate::token::lexer(
r#"
const yo = { a: { b: { c: '123' } } }
// this is a comment
@ -1920,7 +1914,7 @@ const key = 'c'"#,
31,
);
assert_eq!(parser.make_non_code_node(index).unwrap(), expected_output);
let tokens = crate::tokeniser::lexer(
let tokens = crate::token::lexer(
r#"const mySketch = startSketchAt([0,0])
|> lineTo({ to: [0, 1], tag: 'myPath' }, %)
|> lineTo([1, 1], %) /* this is
@ -1946,7 +1940,7 @@ const key = 'c'"#,
#[test]
fn test_collect_object_keys() {
let tokens = crate::tokeniser::lexer("const prop = yo.one[\"two\"]");
let tokens = crate::token::lexer("const prop = yo.one[\"two\"]");
let parser = Parser::new(tokens);
let keys_info = parser.collect_object_keys(6, None, false).unwrap();
assert_eq!(keys_info.len(), 2);
@ -1966,7 +1960,7 @@ const key = 'c'"#,
#[test]
fn test_make_literal_call_expression() {
let tokens = crate::tokeniser::lexer("log(5, \"hello\", aIdentifier)");
let tokens = crate::token::lexer("log(5, \"hello\", aIdentifier)");
let parser = Parser::new(tokens);
let literal = parser.make_literal(2).unwrap();
assert_eq!(
@ -1990,74 +1984,88 @@ const key = 'c'"#,
);
}
#[test]
fn test_is_code_token() {
let tokens = [
Token {
token_type: TokenType::Word,
start: 0,
end: 3,
value: "log".to_string(),
},
Token {
token_type: TokenType::Brace,
start: 3,
end: 4,
value: "(".to_string(),
},
Token {
token_type: TokenType::Number,
start: 4,
end: 5,
value: "5".to_string(),
},
Token {
token_type: TokenType::Comma,
start: 5,
end: 6,
value: ",".to_string(),
},
Token {
token_type: TokenType::String,
start: 7,
end: 14,
value: "\"hello\"".to_string(),
},
Token {
token_type: TokenType::Word,
start: 16,
end: 27,
value: "aIdentifier".to_string(),
},
Token {
token_type: TokenType::Brace,
start: 27,
end: 28,
value: ")".to_string(),
},
];
for (i, token) in tokens.iter().enumerate() {
assert!(token.is_code_token(), "failed test {i}: {token:?}")
}
}
#[test]
fn test_is_not_code_token() {
assert!(!is_not_code_token(&Token {
token_type: TokenType::Word,
start: 0,
end: 3,
value: "log".to_string(),
}));
assert!(!is_not_code_token(&Token {
token_type: TokenType::Brace,
start: 3,
end: 4,
value: "(".to_string(),
}));
assert!(!is_not_code_token(&Token {
token_type: TokenType::Number,
start: 4,
end: 5,
value: "5".to_string(),
}));
assert!(!is_not_code_token(&Token {
token_type: TokenType::Comma,
start: 5,
end: 6,
value: ",".to_string(),
}));
assert!(is_not_code_token(&Token {
token_type: TokenType::Whitespace,
start: 6,
end: 7,
value: " ".to_string(),
}));
assert!(!is_not_code_token(&Token {
token_type: TokenType::String,
start: 7,
end: 14,
value: "\"hello\"".to_string(),
}));
assert!(!is_not_code_token(&Token {
token_type: TokenType::Word,
start: 16,
end: 27,
value: "aIdentifier".to_string(),
}));
assert!(!is_not_code_token(&Token {
token_type: TokenType::Brace,
start: 27,
end: 28,
value: ")".to_string(),
}));
assert!(is_not_code_token(&Token {
token_type: TokenType::BlockComment,
start: 28,
end: 30,
value: "/* abte */".to_string(),
}));
assert!(is_not_code_token(&Token {
token_type: TokenType::LineComment,
start: 30,
end: 33,
value: "// yoyo a line".to_string(),
}));
let tokens = [
Token {
token_type: TokenType::Whitespace,
start: 6,
end: 7,
value: " ".to_string(),
},
Token {
token_type: TokenType::BlockComment,
start: 28,
end: 30,
value: "/* abte */".to_string(),
},
Token {
token_type: TokenType::LineComment,
start: 30,
end: 33,
value: "// yoyo a line".to_string(),
},
];
for (i, token) in tokens.iter().enumerate() {
assert!(!token.is_code_token(), "failed test {i}: {token:?}")
}
}
#[test]
fn test_next_meaningful_token() {
let _offset = 1;
let tokens = crate::tokeniser::lexer(
let tokens = crate::token::lexer(
r#"const mySketch = startSketchAt([0,0])
|> lineTo({ to: [0, 1], tag: 'myPath' }, %)
|> lineTo([1, 1], %) /* this is
@ -2443,7 +2451,7 @@ const key = 'c'"#,
#[test]
fn test_find_closing_brace() {
let tokens = crate::tokeniser::lexer(
let tokens = crate::token::lexer(
r#"const mySketch = startSketchAt([0,0])
|> lineTo({ to: [0, 1], tag: 'myPath' }, %)
|> lineTo([1, 1], %) /* this is
@ -2460,16 +2468,16 @@ const key = 'c'"#,
assert_eq!(parser.find_closing_brace(90, 0, "").unwrap(), 92);
let basic = "( hey )";
let parser = Parser::new(crate::tokeniser::lexer(basic));
let parser = Parser::new(crate::token::lexer(basic));
assert_eq!(parser.find_closing_brace(0, 0, "").unwrap(), 4);
let handles_non_zero_index = "(indexForBracketToRightOfThisIsTwo(shouldBeFour)AndNotThisSix)";
let parser = Parser::new(crate::tokeniser::lexer(handles_non_zero_index));
let parser = Parser::new(crate::token::lexer(handles_non_zero_index));
assert_eq!(parser.find_closing_brace(2, 0, "").unwrap(), 4);
assert_eq!(parser.find_closing_brace(0, 0, "").unwrap(), 6);
let handles_nested = "{a{b{c(}d]}eathou athoeu tah u} thatOneToTheLeftIsLast }";
let parser = Parser::new(crate::tokeniser::lexer(handles_nested));
let parser = Parser::new(crate::token::lexer(handles_nested));
assert_eq!(parser.find_closing_brace(0, 0, "").unwrap(), 18);
// TODO expect error when not started on a brace
@ -2477,7 +2485,7 @@ const key = 'c'"#,
#[test]
fn test_is_call_expression() {
let tokens = crate::tokeniser::lexer(
let tokens = crate::token::lexer(
r#"const mySketch = startSketchAt([0,0])
|> lineTo({ to: [0, 1], tag: 'myPath' }, %)
|> lineTo([1, 1], %) /* this is
@ -2498,7 +2506,7 @@ const key = 'c'"#,
#[test]
fn test_find_next_declaration_keyword() {
let tokens = crate::tokeniser::lexer(
let tokens = crate::token::lexer(
r#"const mySketch = startSketchAt([0,0])
|> lineTo({ to: [0, 1], tag: 'myPath' }, %)
|> lineTo([1, 1], %) /* this is
@ -2513,7 +2521,7 @@ const key = 'c'"#,
TokenReturn { token: None, index: 92 }
);
let tokens = crate::tokeniser::lexer(
let tokens = crate::token::lexer(
r#"const myVar = 5
const newVar = myVar + 1
"#,
@ -2543,7 +2551,7 @@ const newVar = myVar + 1
lineTo(2, 3)
} |> rx(45, %)
"#;
let tokens = crate::tokeniser::lexer(code);
let tokens = crate::token::lexer(code);
let parser = Parser::new(tokens);
assert_eq!(
parser.has_pipe_operator(0, None).unwrap(),
@ -2562,7 +2570,7 @@ const newVar = myVar + 1
lineTo(2, 3)
} |> rx(45, %) |> rx(45, %)
"#;
let tokens = crate::tokeniser::lexer(code);
let tokens = crate::token::lexer(code);
let parser = Parser::new(tokens);
assert_eq!(
parser.has_pipe_operator(0, None).unwrap(),
@ -2584,7 +2592,7 @@ const newVar = myVar + 1
const yo = myFunc(9()
|> rx(45, %)
"#;
let tokens = crate::tokeniser::lexer(code);
let tokens = crate::token::lexer(code);
let parser = Parser::new(tokens);
assert_eq!(
parser.has_pipe_operator(0, None).unwrap(),
@ -2596,7 +2604,7 @@ const yo = myFunc(9()
);
let code = "const myVar2 = 5 + 1 |> myFn(%)";
let tokens = crate::tokeniser::lexer(code);
let tokens = crate::token::lexer(code);
let parser = Parser::new(tokens);
assert_eq!(
parser.has_pipe_operator(1, None).unwrap(),
@ -2618,7 +2626,7 @@ const yo = myFunc(9()
lineTo(1,1)
} |> rx(90, %)
show(mySk1)"#;
let tokens = crate::tokeniser::lexer(code);
let tokens = crate::token::lexer(code);
let parser = Parser::new(tokens.clone());
let token_with_my_path_index = tokens.iter().position(|token| token.value == "myPath").unwrap();
// loop through getting the token and it's index
@ -2658,7 +2666,7 @@ show(mySk1)"#;
#[test]
fn test_make_member_expression() {
let tokens = crate::tokeniser::lexer("const prop = yo.one[\"two\"]");
let tokens = crate::token::lexer("const prop = yo.one[\"two\"]");
let parser = Parser::new(tokens);
let member_expression_return = parser.make_member_expression(6).unwrap();
let member_expression = member_expression_return.expression;
@ -2700,63 +2708,63 @@ show(mySk1)"#;
#[test]
fn test_find_end_of_binary_expression() {
let code = "1 + 2 * 3\nconst yo = 5";
let tokens = crate::tokeniser::lexer(code);
let tokens = crate::token::lexer(code);
let parser = Parser::new(tokens.clone());
let end = parser.find_end_of_binary_expression(0).unwrap();
assert_eq!(tokens[end].value, "3");
let code = "(1 + 25) / 5 - 3\nconst yo = 5";
let tokens = crate::tokeniser::lexer(code);
let tokens = crate::token::lexer(code);
let parser = Parser::new(tokens.clone());
let end = parser.find_end_of_binary_expression(0).unwrap();
assert_eq!(tokens[end].value, "3");
let index_of_5 = code.find('5').unwrap();
let end_starting_at_the_5 = parser.find_end_of_binary_expression(index_of_5).unwrap();
assert_eq!(end_starting_at_the_5, end);
// whole thing wraped
// whole thing wrapped
let code = "((1 + 2) / 5 - 3)\nconst yo = 5";
let tokens = crate::tokeniser::lexer(code);
let tokens = crate::token::lexer(code);
let parser = Parser::new(tokens.clone());
let end = parser.find_end_of_binary_expression(0).unwrap();
assert_eq!(tokens[end].end, code.find("3)").unwrap() + 2);
// whole thing wraped but given index after the first brace
// whole thing wrapped but given index after the first brace
let code = "((1 + 2) / 5 - 3)\nconst yo = 5";
let tokens = crate::tokeniser::lexer(code);
let tokens = crate::token::lexer(code);
let parser = Parser::new(tokens.clone());
let end = parser.find_end_of_binary_expression(1).unwrap();
assert_eq!(tokens[end].value, "3");
// given the index of a small wrapped section i.e. `1 + 2` in ((1 + 2) / 5 - 3)'
let code = "((1 + 2) / 5 - 3)\nconst yo = 5";
let tokens = crate::tokeniser::lexer(code);
let tokens = crate::token::lexer(code);
let parser = Parser::new(tokens.clone());
let end = parser.find_end_of_binary_expression(2).unwrap();
assert_eq!(tokens[end].value, "2");
// lots of silly nesting
let code = "(1 + 2) / (5 - (3))\nconst yo = 5";
let tokens = crate::tokeniser::lexer(code);
let tokens = crate::token::lexer(code);
let parser = Parser::new(tokens.clone());
let end = parser.find_end_of_binary_expression(0).unwrap();
assert_eq!(tokens[end].end, code.find("))").unwrap() + 2);
// with pipe operator at the end
let code = "(1 + 2) / (5 - (3))\n |> fn(%)";
let tokens = crate::tokeniser::lexer(code);
let tokens = crate::token::lexer(code);
let parser = Parser::new(tokens.clone());
let end = parser.find_end_of_binary_expression(0).unwrap();
assert_eq!(tokens[end].end, code.find("))").unwrap() + 2);
// with call expression at the start of binary expression
let code = "yo(2) + 3\n |> fn(%)";
let tokens = crate::tokeniser::lexer(code);
let tokens = crate::token::lexer(code);
let parser = Parser::new(tokens.clone());
let end = parser.find_end_of_binary_expression(0).unwrap();
assert_eq!(tokens[end].value, "3");
// with call expression at the end of binary expression
let code = "3 + yo(2)\n |> fn(%)";
let tokens = crate::tokeniser::lexer(code);
let tokens = crate::token::lexer(code);
let parser = Parser::new(tokens);
let _end = parser.find_end_of_binary_expression(0).unwrap();
// with call expression at the end of binary expression
let code = "-legX + 2, ";
let tokens = crate::tokeniser::lexer(code);
let tokens = crate::token::lexer(code);
let parser = Parser::new(tokens.clone());
let end = parser.find_end_of_binary_expression(0).unwrap();
assert_eq!(tokens[end].value, "2");
@ -2765,7 +2773,7 @@ show(mySk1)"#;
#[test]
fn test_make_array_expression() {
// input_index: 6, output_index: 14, output: {"type":"ArrayExpression","start":11,"end":26,"elements":[{"type":"Literal","start":12,"end":15,"value":"1","raw":"\"1\""},{"type":"Literal","start":17,"end":18,"value":2,"raw":"2"},{"type":"Identifier","start":20,"end":25,"name":"three"}]}
let tokens = crate::tokeniser::lexer("const yo = [\"1\", 2, three]");
let tokens = crate::token::lexer("const yo = [\"1\", 2, three]");
let parser = Parser::new(tokens);
let array_expression = parser.make_array_expression(6).unwrap();
let expression = array_expression.expression;
@ -2804,7 +2812,7 @@ show(mySk1)"#;
#[test]
fn test_make_call_expression() {
let tokens = crate::tokeniser::lexer("foo(\"a\", a, 3)");
let tokens = crate::token::lexer("foo(\"a\", a, 3)");
let parser = Parser::new(tokens);
let result = parser.make_call_expression(0).unwrap();
assert_eq!(result.last_index, 9);
@ -2838,7 +2846,7 @@ show(mySk1)"#;
#[test]
fn test_make_variable_declaration() {
let tokens = crate::tokeniser::lexer(
let tokens = crate::token::lexer(
r#"const yo = startSketch([0, 0])
|> lineTo([1, myVar], %)
|> foo(myVar2, %)
@ -2908,7 +2916,7 @@ show(mySk1)"#;
#[test]
fn test_make_body() {
let tokens = crate::tokeniser::lexer("const myVar = 5");
let tokens = crate::token::lexer("const myVar = 5");
let parser = Parser::new(tokens);
let body = parser
.make_body(
@ -2926,7 +2934,7 @@ show(mySk1)"#;
#[test]
fn test_abstract_syntax_tree() {
let code = "5 +6";
let parser = Parser::new(crate::tokeniser::lexer(code));
let parser = Parser::new(crate::token::lexer(code));
let result = parser.ast().unwrap();
let expected_result = Program {
start: 0,
@ -2964,8 +2972,8 @@ show(mySk1)"#;
#[test]
fn test_empty_file() {
let some_program_string = r#""#;
let tokens = crate::tokeniser::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens);
let tokens = crate::token::lexer(some_program_string);
let parser = Parser::new(tokens);
let result = parser.ast();
assert!(result.is_err());
assert!(result.err().unwrap().to_string().contains("file is empty"));
@ -2973,7 +2981,7 @@ show(mySk1)"#;
#[test]
fn test_parse_half_pipe_small() {
let tokens = crate::tokeniser::lexer(
let tokens = crate::token::lexer(
"const secondExtrude = startSketchAt([0,0])
|",
);
@ -2985,14 +2993,14 @@ show(mySk1)"#;
#[test]
fn test_parse_member_expression_double_nested_braces() {
let tokens = crate::tokeniser::lexer(r#"const prop = yo["one"][two]"#);
let tokens = crate::token::lexer(r#"const prop = yo["one"][two]"#);
let parser = Parser::new(tokens);
parser.ast().unwrap();
}
#[test]
fn test_parse_member_expression_binary_expression_period_number_first() {
let tokens = crate::tokeniser::lexer(
let tokens = crate::token::lexer(
r#"const obj = { a: 1, b: 2 }
const height = 1 - obj.a"#,
);
@ -3002,7 +3010,7 @@ const height = 1 - obj.a"#,
#[test]
fn test_parse_member_expression_binary_expression_brace_number_first() {
let tokens = crate::tokeniser::lexer(
let tokens = crate::token::lexer(
r#"const obj = { a: 1, b: 2 }
const height = 1 - obj["a"]"#,
);
@ -3012,7 +3020,7 @@ const height = 1 - obj["a"]"#,
#[test]
fn test_parse_member_expression_binary_expression_brace_number_second() {
let tokens = crate::tokeniser::lexer(
let tokens = crate::token::lexer(
r#"const obj = { a: 1, b: 2 }
const height = obj["a"] - 1"#,
);
@ -3022,7 +3030,7 @@ const height = obj["a"] - 1"#,
#[test]
fn test_parse_member_expression_binary_expression_in_array_number_first() {
let tokens = crate::tokeniser::lexer(
let tokens = crate::token::lexer(
r#"const obj = { a: 1, b: 2 }
const height = [1 - obj["a"], 0]"#,
);
@ -3032,7 +3040,7 @@ const height = [1 - obj["a"], 0]"#,
#[test]
fn test_parse_member_expression_binary_expression_in_array_number_second() {
let tokens = crate::tokeniser::lexer(
let tokens = crate::token::lexer(
r#"const obj = { a: 1, b: 2 }
const height = [obj["a"] - 1, 0]"#,
);
@ -3042,7 +3050,7 @@ const height = [obj["a"] - 1, 0]"#,
#[test]
fn test_parse_member_expression_binary_expression_in_array_number_second_missing_space() {
let tokens = crate::tokeniser::lexer(
let tokens = crate::token::lexer(
r#"const obj = { a: 1, b: 2 }
const height = [obj["a"] -1, 0]"#,
);
@ -3052,7 +3060,7 @@ const height = [obj["a"] -1, 0]"#,
#[test]
fn test_parse_half_pipe() {
let tokens = crate::tokeniser::lexer(
let tokens = crate::token::lexer(
"const height = 10
const firstExtrude = startSketchAt([0,0])
@ -3075,15 +3083,17 @@ const secondExtrude = startSketchAt([0,0])
#[test]
fn test_parse_greater_bang() {
let tokens = crate::tokeniser::lexer(">!");
let tokens = crate::token::lexer(">!");
let parser = Parser::new(tokens);
let result = parser.ast();
assert!(result.is_ok());
let err = parser.ast().unwrap_err();
// TODO: Better errors when program cannot tokenize.
// https://github.com/KittyCAD/modeling-app/issues/696
assert!(err.to_string().contains("file is empty"));
}
#[test]
fn test_parse_z_percent_parens() {
let tokens = crate::tokeniser::lexer("z%)");
let tokens = crate::token::lexer("z%)");
let parser = Parser::new(tokens);
let result = parser.ast();
assert!(result.is_err());
@ -3092,15 +3102,17 @@ const secondExtrude = startSketchAt([0,0])
#[test]
fn test_parse_parens_unicode() {
let tokens = crate::tokeniser::lexer("");
let tokens = crate::token::lexer("");
let parser = Parser::new(tokens);
let result = parser.ast();
assert!(result.is_ok());
// TODO: Better errors when program cannot tokenize.
// https://github.com/KittyCAD/modeling-app/issues/696
assert!(result.is_err());
}
#[test]
fn test_parse_negative_in_array_binary_expression() {
let tokens = crate::tokeniser::lexer(
let tokens = crate::token::lexer(
r#"const leg1 = 5
const thickness = 0.56
@ -3114,7 +3126,7 @@ const bracket = [-leg2 + thickness, 0]
#[test]
fn test_parse_nested_open_brackets() {
let tokens = crate::tokeniser::lexer(
let tokens = crate::token::lexer(
r#"
z(-[["#,
);
@ -3129,31 +3141,38 @@ z(-[["#,
#[test]
fn test_parse_weird_new_line_function() {
let tokens = crate::tokeniser::lexer(
let tokens = crate::token::lexer(
r#"z
(--#"#,
);
let parser = Parser::new(tokens);
let result = parser.ast();
assert!(result.is_err());
// TODO: Better errors when program cannot tokenize.
// https://github.com/KittyCAD/modeling-app/issues/696
assert_eq!(
result.err().unwrap().to_string(),
r#"syntax: KclErrorDetails { source_ranges: [SourceRange([0, 1])], message: "missing a closing brace for the function call" }"#
r#"semantic: KclErrorDetails { source_ranges: [], message: "file is empty" }"#
);
}
#[test]
fn test_parse_weird_lots_of_fancy_brackets() {
let tokens = crate::tokeniser::lexer(r#"zz({{{{{{{{)iegAng{{{{{{{##"#);
let tokens = crate::token::lexer(r#"zz({{{{{{{{)iegAng{{{{{{{##"#);
let parser = Parser::new(tokens);
let result = parser.ast();
assert!(result.is_err());
assert!(result.err().unwrap().to_string().contains("unexpected end"));
// TODO: Better errors when program cannot tokenize.
// https://github.com/KittyCAD/modeling-app/issues/696
assert_eq!(
result.err().unwrap().to_string(),
r#"semantic: KclErrorDetails { source_ranges: [], message: "file is empty" }"#
);
}
#[test]
fn test_parse_weird_close_before_open() {
let tokens = crate::tokeniser::lexer(
let tokens = crate::token::lexer(
r#"fn)n
e
["#,
@ -3170,7 +3189,7 @@ e
#[test]
fn test_parse_weird_close_before_nada() {
let tokens = crate::tokeniser::lexer(r#"fn)n-"#);
let tokens = crate::token::lexer(r#"fn)n-"#);
let parser = Parser::new(tokens);
let result = parser.ast();
assert!(result.is_err());
@ -3179,7 +3198,7 @@ e
#[test]
fn test_parse_weird_lots_of_slashes() {
let tokens = crate::tokeniser::lexer(
let tokens = crate::token::lexer(
r#"J///////////o//+///////////P++++*++++++P///////˟
++4"#,
);
@ -3196,7 +3215,7 @@ e
#[test]
fn test_parse_expand_array() {
let code = "const myArray = [0..10]";
let parser = Parser::new(crate::tokeniser::lexer(code));
let parser = Parser::new(crate::token::lexer(code));
let result = parser.ast().unwrap();
let expected_result = Program {
start: 0,
@ -3299,8 +3318,8 @@ e
#[test]
fn test_error_keyword_in_variable() {
let some_program_string = r#"const let = "thing""#;
let tokens = crate::tokeniser::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens);
let tokens = crate::token::lexer(some_program_string);
let parser = Parser::new(tokens);
let result = parser.ast();
assert!(result.is_err());
assert_eq!(
@ -3312,8 +3331,8 @@ e
#[test]
fn test_error_keyword_in_fn_name() {
let some_program_string = r#"fn let = () {}"#;
let tokens = crate::tokeniser::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens);
let tokens = crate::token::lexer(some_program_string);
let parser = Parser::new(tokens);
let result = parser.ast();
assert!(result.is_err());
assert_eq!(
@ -3325,8 +3344,8 @@ e
#[test]
fn test_error_stdlib_in_fn_name() {
let some_program_string = r#"fn cos = () {}"#;
let tokens = crate::tokeniser::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens);
let tokens = crate::token::lexer(some_program_string);
let parser = Parser::new(tokens);
let result = parser.ast();
assert!(result.is_err());
assert_eq!(
@ -3340,8 +3359,8 @@ e
let some_program_string = r#"fn thing = (let) => {
return 1
}"#;
let tokens = crate::tokeniser::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens);
let tokens = crate::token::lexer(some_program_string);
let parser = Parser::new(tokens);
let result = parser.ast();
assert!(result.is_err());
assert_eq!(
@ -3355,8 +3374,8 @@ e
let some_program_string = r#"fn thing = (cos) => {
return 1
}"#;
let tokens = crate::tokeniser::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens);
let tokens = crate::token::lexer(some_program_string);
let parser = Parser::new(tokens);
let result = parser.ast();
assert!(result.is_err());
assert_eq!(
@ -3373,8 +3392,8 @@ e
}
firstPrimeNumber()
"#;
let tokens = crate::tokeniser::lexer(program);
let parser = crate::parser::Parser::new(tokens);
let tokens = crate::token::lexer(program);
let parser = Parser::new(tokens);
let _ast = parser.ast().unwrap();
}
@ -3386,8 +3405,8 @@ e
thing(false)
"#;
let tokens = crate::tokeniser::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens);
let tokens = crate::token::lexer(some_program_string);
let parser = Parser::new(tokens);
parser.ast().unwrap();
}
@ -3403,8 +3422,8 @@ thing(false)
"#,
name
);
let tokens = crate::tokeniser::lexer(&some_program_string);
let parser = crate::parser::Parser::new(tokens);
let tokens = crate::token::lexer(&some_program_string);
let parser = Parser::new(tokens);
let result = parser.ast();
assert!(result.is_err());
assert_eq!(
@ -3421,8 +3440,8 @@ thing(false)
#[test]
fn test_error_define_var_as_function() {
let some_program_string = r#"fn thing = "thing""#;
let tokens = crate::tokeniser::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens);
let tokens = crate::token::lexer(some_program_string);
let parser = Parser::new(tokens);
let result = parser.ast();
assert!(result.is_err());
assert_eq!(
@ -3450,8 +3469,8 @@ const pt2 = b2[0]
show(b1)
show(b2)"#;
let tokens = crate::tokeniser::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens);
let tokens = crate::token::lexer(some_program_string);
let parser = Parser::new(tokens);
parser.ast().unwrap();
}
@ -3459,18 +3478,36 @@ show(b2)"#;
fn test_math_with_stdlib() {
let some_program_string = r#"const d2r = pi() / 2
let other_thing = 2 * cos(3)"#;
let tokens = crate::tokeniser::lexer(some_program_string);
let parser = crate::parser::Parser::new(tokens);
let tokens = crate::token::lexer(some_program_string);
let parser = Parser::new(tokens);
parser.ast().unwrap();
}
#[test]
#[ignore] // ignore until more stack fixes
fn test_parse_pipes_on_pipes() {
let code = include_str!("../../tests/executor/inputs/pipes_on_pipes.kcl");
let tokens = crate::tokeniser::lexer(code);
let parser = crate::parser::Parser::new(tokens);
let tokens = crate::token::lexer(code);
let parser = Parser::new(tokens);
parser.ast().unwrap();
}
#[test]
fn test_negative_arguments() {
let some_program_string = r#"fn box = (p, h, l, w) => {
const myBox = startSketchAt(p)
|> line([0, l], %)
|> line([w, 0], %)
|> line([0, -l], %)
|> close(%)
|> extrude(h, %)
return myBox
}
let myBox = box([0,0], -3, -16, -10)
show(myBox)"#;
let tokens = crate::token::lexer(some_program_string);
let parser = Parser::new(tokens);
parser.ast().unwrap();
}
}

View File

@ -34,7 +34,7 @@ pub struct Backend {
/// The types of tokens the server supports.
pub token_types: Vec<SemanticTokenType>,
/// Token maps.
pub token_map: DashMap<String, Vec<crate::tokeniser::Token>>,
pub token_map: DashMap<String, Vec<crate::token::Token>>,
/// AST maps.
pub ast_map: DashMap<String, crate::ast::types::Program>,
/// Current code.
@ -56,7 +56,7 @@ impl Backend {
// Lets update the tokens.
self.current_code_map
.insert(params.uri.to_string(), params.text.clone());
let tokens = crate::tokeniser::lexer(&params.text);
let tokens = crate::token::lexer(&params.text);
self.token_map.insert(params.uri.to_string(), tokens.clone());
// Update the semantic tokens map.
@ -69,9 +69,7 @@ impl Backend {
continue;
};
if token.token_type == crate::tokeniser::TokenType::Word
&& self.stdlib_completions.contains_key(&token.value)
{
if token.token_type == crate::token::TokenType::Word && self.stdlib_completions.contains_key(&token.value) {
// This is a stdlib function.
token_type = SemanticTokenType::FUNCTION;
}
@ -549,7 +547,7 @@ impl LanguageServer for Backend {
// Parse the ast.
// I don't know if we need to do this again since it should be updated in the context.
// But I figure better safe than sorry since this will write back out to the file.
let tokens = crate::tokeniser::lexer(&current_code);
let tokens = crate::token::lexer(&current_code);
let parser = crate::parser::Parser::new(tokens);
let Ok(ast) = parser.ast() else {
return Ok(None);
@ -581,7 +579,7 @@ impl LanguageServer for Backend {
// Parse the ast.
// I don't know if we need to do this again since it should be updated in the context.
// But I figure better safe than sorry since this will write back out to the file.
let tokens = crate::tokeniser::lexer(&current_code);
let tokens = crate::token::lexer(&current_code);
let parser = crate::parser::Parser::new(tokens);
let Ok(mut ast) = parser.ast() else {
return Ok(None);

View File

@ -11,10 +11,10 @@ use crate::{
};
/// Extrudes by a given amount.
pub fn extrude(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn extrude(args: Args) -> Result<MemoryItem, KclError> {
let (length, sketch_group) = args.get_number_sketch_group()?;
let result = inner_extrude(length, sketch_group, args)?;
let result = inner_extrude(length, sketch_group, args).await?;
Ok(MemoryItem::ExtrudeGroup(result))
}
@ -23,7 +23,7 @@ pub fn extrude(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib {
name = "extrude"
}]
fn inner_extrude(length: f64, sketch_group: Box<SketchGroup>, args: &mut Args) -> Result<Box<ExtrudeGroup>, KclError> {
async fn inner_extrude(length: f64, sketch_group: Box<SketchGroup>, args: Args) -> Result<Box<ExtrudeGroup>, KclError> {
let id = uuid::Uuid::new_v4();
let cmd = kittycad::types::ModelingCmd::Extrude {
@ -31,7 +31,7 @@ fn inner_extrude(length: f64, sketch_group: Box<SketchGroup>, args: &mut Args) -
distance: length,
cap: true,
};
args.send_modeling_cmd(id, cmd)?;
args.send_modeling_cmd(id, cmd).await?;
Ok(Box::new(ExtrudeGroup {
id,
@ -46,7 +46,7 @@ fn inner_extrude(length: f64, sketch_group: Box<SketchGroup>, args: &mut Args) -
}
/// Returns the extrude wall transform.
pub fn get_extrude_wall_transform(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn get_extrude_wall_transform(args: Args) -> Result<MemoryItem, KclError> {
let (surface_name, extrude_group) = args.get_path_name_extrude_group()?;
let result = inner_get_extrude_wall_transform(&surface_name, *extrude_group, args)?;
Ok(MemoryItem::ExtrudeTransform(result))
@ -59,8 +59,8 @@ pub fn get_extrude_wall_transform(args: &mut Args) -> Result<MemoryItem, KclErro
fn inner_get_extrude_wall_transform(
surface_name: &str,
extrude_group: ExtrudeGroup,
args: &mut Args,
) -> Result<ExtrudeTransform, KclError> {
args: Args,
) -> Result<Box<ExtrudeTransform>, KclError> {
let surface = extrude_group.get_path_by_name(surface_name).ok_or_else(|| {
KclError::Type(KclErrorDetails {
message: format!(
@ -71,9 +71,9 @@ fn inner_get_extrude_wall_transform(
})
})?;
Ok(ExtrudeTransform {
Ok(Box::new(ExtrudeTransform {
position: surface.get_position(),
rotation: surface.get_rotation(),
meta: extrude_group.meta,
})
}))
}
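Across the stdlib, `pub fn f(args: &mut Args)` becomes `pub async fn f(args: Args)`: each function now takes `Args` by value and awaits `args.send_modeling_cmd(..)`. One consequence of taking ownership is that the returned future borrows nothing from the caller and is `'static`, so it can be boxed and stored like the executor's other futures; a tiny sketch with a placeholder `Args` (not the real struct):

use std::future::Future;

// Placeholder for the real `Args`; only the ownership matters here, not the fields.
struct Args {
    value: f64,
}

// New shape: the function owns its argument, so its future is self-contained.
async fn sqrt(args: Args) -> f64 {
    args.value.sqrt()
}

fn assert_static<F: Future + 'static>(_: &F) {}

fn main() {
    let fut = sqrt(Args { value: 2.0 });
    assert_static(&fut); // compiles because the future owns `Args`
    let result = futures::executor::block_on(fut);
    assert!((result - 2.0_f64.sqrt()).abs() < 1e-12);
}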

View File

@ -11,7 +11,7 @@ use crate::{
};
/// Computes the cosine of a number (in radians).
pub fn cos(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn cos(args: Args) -> Result<MemoryItem, KclError> {
let num = args.get_number()?;
let result = inner_cos(num)?;
@ -27,7 +27,7 @@ fn inner_cos(num: f64) -> Result<f64, KclError> {
}
/// Computes the sine of a number (in radians).
pub fn sin(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn sin(args: Args) -> Result<MemoryItem, KclError> {
let num = args.get_number()?;
let result = inner_sin(num)?;
@ -43,7 +43,7 @@ fn inner_sin(num: f64) -> Result<f64, KclError> {
}
/// Computes the tangent of a number (in radians).
pub fn tan(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn tan(args: Args) -> Result<MemoryItem, KclError> {
let num = args.get_number()?;
let result = inner_tan(num)?;
@ -59,7 +59,7 @@ fn inner_tan(num: f64) -> Result<f64, KclError> {
}
/// Return the value of `pi`. Archimedes constant (π).
pub fn pi(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn pi(args: Args) -> Result<MemoryItem, KclError> {
let result = inner_pi()?;
args.make_user_val_from_f64(result)
@ -74,7 +74,7 @@ fn inner_pi() -> Result<f64, KclError> {
}
/// Computes the square root of a number.
pub fn sqrt(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn sqrt(args: Args) -> Result<MemoryItem, KclError> {
let num = args.get_number()?;
let result = inner_sqrt(num)?;
@ -90,7 +90,7 @@ fn inner_sqrt(num: f64) -> Result<f64, KclError> {
}
/// Computes the absolute value of a number.
pub fn abs(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn abs(args: Args) -> Result<MemoryItem, KclError> {
let num = args.get_number()?;
let result = inner_abs(num)?;
@ -106,7 +106,7 @@ fn inner_abs(num: f64) -> Result<f64, KclError> {
}
/// Computes the largest integer less than or equal to a number.
pub fn floor(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn floor(args: Args) -> Result<MemoryItem, KclError> {
let num = args.get_number()?;
let result = inner_floor(num)?;
@ -122,7 +122,7 @@ fn inner_floor(num: f64) -> Result<f64, KclError> {
}
/// Computes the smallest integer greater than or equal to a number.
pub fn ceil(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn ceil(args: Args) -> Result<MemoryItem, KclError> {
let num = args.get_number()?;
let result = inner_ceil(num)?;
@ -138,7 +138,7 @@ fn inner_ceil(num: f64) -> Result<f64, KclError> {
}
/// Computes the minimum of the given arguments.
pub fn min(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn min(args: Args) -> Result<MemoryItem, KclError> {
let nums = args.get_number_array()?;
let result = inner_min(nums);
@ -161,7 +161,7 @@ fn inner_min(args: Vec<f64>) -> f64 {
}
/// Computes the maximum of the given arguments.
pub fn max(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn max(args: Args) -> Result<MemoryItem, KclError> {
let nums = args.get_number_array()?;
let result = inner_max(nums);
@ -184,7 +184,7 @@ fn inner_max(args: Vec<f64>) -> f64 {
}
/// Computes the number to a power.
pub fn pow(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn pow(args: Args) -> Result<MemoryItem, KclError> {
let nums = args.get_number_array()?;
if nums.len() > 2 {
return Err(KclError::Type(KclErrorDetails {
@ -214,7 +214,7 @@ fn inner_pow(num: f64, pow: f64) -> Result<f64, KclError> {
}
/// Computes the arccosine of a number (in radians).
pub fn acos(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn acos(args: Args) -> Result<MemoryItem, KclError> {
let num = args.get_number()?;
let result = inner_acos(num)?;
@ -230,7 +230,7 @@ fn inner_acos(num: f64) -> Result<f64, KclError> {
}
/// Computes the arcsine of a number (in radians).
pub fn asin(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn asin(args: Args) -> Result<MemoryItem, KclError> {
let num = args.get_number()?;
let result = inner_asin(num)?;
@ -246,7 +246,7 @@ fn inner_asin(num: f64) -> Result<f64, KclError> {
}
/// Computes the arctangent of a number (in radians).
pub fn atan(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn atan(args: Args) -> Result<MemoryItem, KclError> {
let num = args.get_number()?;
let result = inner_atan(num)?;
@ -266,7 +266,7 @@ fn inner_atan(num: f64) -> Result<f64, KclError> {
/// The result might not be correctly rounded owing to implementation
/// details; `log2()` can produce more accurate results for base 2,
/// and `log10()` can produce more accurate results for base 10.
pub fn log(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn log(args: Args) -> Result<MemoryItem, KclError> {
let nums = args.get_number_array()?;
if nums.len() > 2 {
return Err(KclError::Type(KclErrorDetails {
@ -299,7 +299,7 @@ fn inner_log(num: f64, base: f64) -> Result<f64, KclError> {
}
/// Computes the base 2 logarithm of the number.
pub fn log2(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn log2(args: Args) -> Result<MemoryItem, KclError> {
let num = args.get_number()?;
let result = inner_log2(num)?;
@ -315,7 +315,7 @@ fn inner_log2(num: f64) -> Result<f64, KclError> {
}
/// Computes the base 10 logarithm of the number.
pub fn log10(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn log10(args: Args) -> Result<MemoryItem, KclError> {
let num = args.get_number()?;
let result = inner_log10(num)?;
@ -331,7 +331,7 @@ fn inner_log10(num: f64) -> Result<f64, KclError> {
}
/// Computes the natural logarithm of the number.
pub fn ln(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn ln(args: Args) -> Result<MemoryItem, KclError> {
let num = args.get_number()?;
let result = inner_ln(num)?;
@ -347,7 +347,7 @@ fn inner_ln(num: f64) -> Result<f64, KclError> {
}
/// Return the value of Euler's number `e`.
pub fn e(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn e(args: Args) -> Result<MemoryItem, KclError> {
let result = inner_e()?;
args.make_user_val_from_f64(result)
@ -362,7 +362,7 @@ fn inner_e() -> Result<f64, KclError> {
}
/// Return the value of `tau`. The full circle constant (τ). Equal to 2π.
pub fn tau(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn tau(args: Args) -> Result<MemoryItem, KclError> {
let result = inner_tau()?;
args.make_user_val_from_f64(result)

View File

@ -10,6 +10,7 @@ use std::collections::HashMap;
use anyhow::Result;
use derive_docs::stdlib;
use kittycad::types::OkWebSocketResponseData;
use parse_display::{Display, FromStr};
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
@ -21,7 +22,7 @@ use crate::{
executor::{ExtrudeGroup, MemoryItem, Metadata, SketchGroup, SourceRange},
};
pub type StdFn = fn(&mut Args) -> Result<MemoryItem, KclError>;
pub type StdFn = fn(Args) -> std::pin::Pin<Box<dyn std::future::Future<Output = Result<MemoryItem, KclError>>>>;
pub type FnMap = HashMap<String, StdFn>;
pub struct StdLib {
@ -102,15 +103,15 @@ impl Default for StdLib {
}
}
#[derive(Debug)]
pub struct Args<'a> {
#[derive(Debug, Clone)]
pub struct Args {
pub args: Vec<MemoryItem>,
pub source_range: SourceRange,
engine: &'a mut EngineConnection,
engine: EngineConnection,
}
impl<'a> Args<'a> {
pub fn new(args: Vec<MemoryItem>, source_range: SourceRange, engine: &'a mut EngineConnection) -> Self {
impl Args {
pub fn new(args: Vec<MemoryItem>, source_range: SourceRange, engine: EngineConnection) -> Self {
Self {
args,
source_range,
@ -118,8 +119,12 @@ impl<'a> Args<'a> {
}
}
pub fn send_modeling_cmd(&mut self, id: uuid::Uuid, cmd: kittycad::types::ModelingCmd) -> Result<(), KclError> {
self.engine.send_modeling_cmd(id, self.source_range, cmd)
pub async fn send_modeling_cmd(
&self,
id: uuid::Uuid,
cmd: kittycad::types::ModelingCmd,
) -> Result<OkWebSocketResponseData, KclError> {
self.engine.send_modeling_cmd(id, self.source_range, cmd).await
}
fn make_user_val_from_json(&self, j: serde_json::Value) -> Result<MemoryItem, KclError> {
@ -443,7 +448,7 @@ impl<'a> Args<'a> {
/// Render a model.
// This never actually gets called so this is fine.
pub fn show(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn show<'a>(args: Args) -> Result<MemoryItem, KclError> {
let sketch_group = args.get_sketch_group()?;
inner_show(sketch_group);
@ -457,7 +462,7 @@ pub fn show(args: &mut Args) -> Result<MemoryItem, KclError> {
fn inner_show(_sketch: Box<SketchGroup>) {}
/// Returns the length of the given leg.
pub fn leg_length(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn leg_length(args: Args) -> Result<MemoryItem, KclError> {
let (hypotenuse, leg) = args.get_hypotenuse_leg()?;
let result = inner_leg_length(hypotenuse, leg);
args.make_user_val_from_f64(result)
@ -472,7 +477,7 @@ fn inner_leg_length(hypotenuse: f64, leg: f64) -> f64 {
}
/// Returns the angle of the given leg for x.
pub fn leg_angle_x(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn leg_angle_x(args: Args) -> Result<MemoryItem, KclError> {
let (hypotenuse, leg) = args.get_hypotenuse_leg()?;
let result = inner_leg_angle_x(hypotenuse, leg);
args.make_user_val_from_f64(result)
@ -487,7 +492,7 @@ fn inner_leg_angle_x(hypotenuse: f64, leg: f64) -> f64 {
}
/// Returns the angle of the given leg for y.
pub fn leg_angle_y(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn leg_angle_y(args: Args) -> Result<MemoryItem, KclError> {
let (hypotenuse, leg) = args.get_hypotenuse_leg()?;
let result = inner_leg_angle_y(hypotenuse, leg);
args.make_user_val_from_f64(result)
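
The new `StdFn` alias above is a plain function pointer that returns a boxed, pinned future, so each `pub async fn` in the standard library still needs a small non-capturing adapter before it can live in `FnMap`. A minimal sketch of such an adapter, assuming the async `cos` from the math diff above is in scope; the real table may well be built by the `#[stdlib]` derive macro rather than by hand:

use std::{future::Future, pin::Pin};

// The fn item coerces to the `StdFn` pointer type because it captures
// nothing and merely boxes the future that the async `cos` returns.
fn cos_boxed(args: Args) -> Pin<Box<dyn Future<Output = Result<MemoryItem, KclError>>>> {
    Box::pin(cos(args))
}

// Hypothetical registration helper; `FnMap` is the HashMap<String, StdFn>
// alias defined in the hunk above.
fn register(fns: &mut FnMap) {
    fns.insert("cos".to_string(), cos_boxed as StdFn);
}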

View File

@ -11,9 +11,9 @@ use crate::{
};
/// Returns the segment end of x.
pub fn segment_end_x(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn segment_end_x(args: Args) -> Result<MemoryItem, KclError> {
let (segment_name, sketch_group) = args.get_segment_name_sketch_group()?;
let result = inner_segment_end_x(&segment_name, sketch_group, args)?;
let result = inner_segment_end_x(&segment_name, sketch_group, args.clone())?;
args.make_user_val_from_f64(result)
}
@ -22,7 +22,7 @@ pub fn segment_end_x(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib {
name = "segEndX",
}]
fn inner_segment_end_x(segment_name: &str, sketch_group: Box<SketchGroup>, args: &mut Args) -> Result<f64, KclError> {
fn inner_segment_end_x(segment_name: &str, sketch_group: Box<SketchGroup>, args: Args) -> Result<f64, KclError> {
let line = sketch_group.get_base_by_name_or_start(segment_name).ok_or_else(|| {
KclError::Type(KclErrorDetails {
message: format!(
@ -37,9 +37,9 @@ fn inner_segment_end_x(segment_name: &str, sketch_group: Box<SketchGroup>, args:
}
/// Returns the segment end of y.
pub fn segment_end_y(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn segment_end_y(args: Args) -> Result<MemoryItem, KclError> {
let (segment_name, sketch_group) = args.get_segment_name_sketch_group()?;
let result = inner_segment_end_y(&segment_name, sketch_group, args)?;
let result = inner_segment_end_y(&segment_name, sketch_group, args.clone())?;
args.make_user_val_from_f64(result)
}
@ -48,7 +48,7 @@ pub fn segment_end_y(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib {
name = "segEndY",
}]
fn inner_segment_end_y(segment_name: &str, sketch_group: Box<SketchGroup>, args: &mut Args) -> Result<f64, KclError> {
fn inner_segment_end_y(segment_name: &str, sketch_group: Box<SketchGroup>, args: Args) -> Result<f64, KclError> {
let line = sketch_group.get_base_by_name_or_start(segment_name).ok_or_else(|| {
KclError::Type(KclErrorDetails {
message: format!(
@ -63,9 +63,9 @@ fn inner_segment_end_y(segment_name: &str, sketch_group: Box<SketchGroup>, args:
}
/// Returns the last segment of x.
pub fn last_segment_x(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn last_segment_x(args: Args) -> Result<MemoryItem, KclError> {
let sketch_group = args.get_sketch_group()?;
let result = inner_last_segment_x(sketch_group, args)?;
let result = inner_last_segment_x(sketch_group, args.clone())?;
args.make_user_val_from_f64(result)
}
@ -74,7 +74,7 @@ pub fn last_segment_x(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib {
name = "lastSegX",
}]
fn inner_last_segment_x(sketch_group: Box<SketchGroup>, args: &mut Args) -> Result<f64, KclError> {
fn inner_last_segment_x(sketch_group: Box<SketchGroup>, args: Args) -> Result<f64, KclError> {
let last_line = sketch_group
.value
.last()
@ -93,9 +93,9 @@ fn inner_last_segment_x(sketch_group: Box<SketchGroup>, args: &mut Args) -> Resu
}
/// Returns the last segment of y.
pub fn last_segment_y(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn last_segment_y(args: Args) -> Result<MemoryItem, KclError> {
let sketch_group = args.get_sketch_group()?;
let result = inner_last_segment_y(sketch_group, args)?;
let result = inner_last_segment_y(sketch_group, args.clone())?;
args.make_user_val_from_f64(result)
}
@ -104,7 +104,7 @@ pub fn last_segment_y(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib {
name = "lastSegY",
}]
fn inner_last_segment_y(sketch_group: Box<SketchGroup>, args: &mut Args) -> Result<f64, KclError> {
fn inner_last_segment_y(sketch_group: Box<SketchGroup>, args: Args) -> Result<f64, KclError> {
let last_line = sketch_group
.value
.last()
@ -123,9 +123,9 @@ fn inner_last_segment_y(sketch_group: Box<SketchGroup>, args: &mut Args) -> Resu
}
/// Returns the length of the segment.
pub fn segment_length(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn segment_length(args: Args) -> Result<MemoryItem, KclError> {
let (segment_name, sketch_group) = args.get_segment_name_sketch_group()?;
let result = inner_segment_length(&segment_name, sketch_group, args)?;
let result = inner_segment_length(&segment_name, sketch_group, args.clone())?;
args.make_user_val_from_f64(result)
}
@ -133,7 +133,7 @@ pub fn segment_length(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib {
name = "segLen",
}]
fn inner_segment_length(segment_name: &str, sketch_group: Box<SketchGroup>, args: &mut Args) -> Result<f64, KclError> {
fn inner_segment_length(segment_name: &str, sketch_group: Box<SketchGroup>, args: Args) -> Result<f64, KclError> {
let path = sketch_group.get_path_by_name(segment_name).ok_or_else(|| {
KclError::Type(KclErrorDetails {
message: format!(
@ -151,10 +151,10 @@ fn inner_segment_length(segment_name: &str, sketch_group: Box<SketchGroup>, args
}
/// Returns the angle of the segment.
pub fn segment_angle(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn segment_angle(args: Args) -> Result<MemoryItem, KclError> {
let (segment_name, sketch_group) = args.get_segment_name_sketch_group()?;
let result = inner_segment_angle(&segment_name, sketch_group, args)?;
let result = inner_segment_angle(&segment_name, sketch_group, args.clone())?;
args.make_user_val_from_f64(result)
}
@ -162,7 +162,7 @@ pub fn segment_angle(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib {
name = "segAng",
}]
fn inner_segment_angle(segment_name: &str, sketch_group: Box<SketchGroup>, args: &mut Args) -> Result<f64, KclError> {
fn inner_segment_angle(segment_name: &str, sketch_group: Box<SketchGroup>, args: Args) -> Result<f64, KclError> {
let path = sketch_group.get_path_by_name(segment_name).ok_or_else(|| {
KclError::Type(KclErrorDetails {
message: format!(
@ -180,9 +180,9 @@ fn inner_segment_angle(segment_name: &str, sketch_group: Box<SketchGroup>, args:
}
/// Returns the angle to match the given length for x.
pub fn angle_to_match_length_x(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn angle_to_match_length_x(args: Args) -> Result<MemoryItem, KclError> {
let (segment_name, to, sketch_group) = args.get_segment_name_to_number_sketch_group()?;
let result = inner_angle_to_match_length_x(&segment_name, to, sketch_group, args)?;
let result = inner_angle_to_match_length_x(&segment_name, to, sketch_group, args.clone())?;
args.make_user_val_from_f64(result)
}
@ -194,7 +194,7 @@ fn inner_angle_to_match_length_x(
segment_name: &str,
to: f64,
sketch_group: Box<SketchGroup>,
args: &mut Args,
args: Args,
) -> Result<f64, KclError> {
let path = sketch_group.get_path_by_name(segment_name).ok_or_else(|| {
KclError::Type(KclErrorDetails {
@ -235,9 +235,9 @@ fn inner_angle_to_match_length_x(
}
/// Returns the angle to match the given length for y.
pub fn angle_to_match_length_y(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn angle_to_match_length_y(args: Args) -> Result<MemoryItem, KclError> {
let (segment_name, to, sketch_group) = args.get_segment_name_to_number_sketch_group()?;
let result = inner_angle_to_match_length_y(&segment_name, to, sketch_group, args)?;
let result = inner_angle_to_match_length_y(&segment_name, to, sketch_group, args.clone())?;
args.make_user_val_from_f64(result)
}
@ -249,7 +249,7 @@ fn inner_angle_to_match_length_y(
segment_name: &str,
to: f64,
sketch_group: Box<SketchGroup>,
args: &mut Args,
args: Args,
) -> Result<f64, KclError> {
let path = sketch_group.get_path_by_name(segment_name).ok_or_else(|| {
KclError::Type(KclErrorDetails {

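Because `Args` is now an owned, `Clone` value rather than a `&mut` borrow (see the `Args` changes in the previous file), these wrappers hand the inner helpers `args.clone()` and keep the original for the final `make_user_val_from_f64` call. A standalone sketch of that ownership pattern, with `Ctx` and `measure` as hypothetical stand-ins for `Args` and the inner helpers:

// Clone-then-consume: the inner call takes the context by value,
// so the wrapper clones it and keeps using its own copy afterwards.
#[derive(Clone)]
struct Ctx {
    source_range: (usize, usize),
}

fn measure(_ctx: Ctx) -> f64 {
    42.0
}

fn wrapper(ctx: Ctx) -> (f64, (usize, usize)) {
    let len = measure(ctx.clone()); // `ctx` survives the call...
    (len, ctx.source_range)         // ...and is still usable here
}
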
View File

@ -33,10 +33,10 @@ pub enum LineToData {
}
/// Draw a line to a point.
pub fn line_to(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn line_to(args: Args) -> Result<MemoryItem, KclError> {
let (data, sketch_group): (LineToData, Box<SketchGroup>) = args.get_data_and_sketch_group()?;
let new_sketch_group = inner_line_to(data, sketch_group, args)?;
let new_sketch_group = inner_line_to(data, sketch_group, args).await?;
Ok(MemoryItem::SketchGroup(new_sketch_group))
}
@ -44,10 +44,10 @@ pub fn line_to(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib {
name = "lineTo",
}]
fn inner_line_to(
async fn inner_line_to(
data: LineToData,
sketch_group: Box<SketchGroup>,
args: &mut Args,
args: Args,
) -> Result<Box<SketchGroup>, KclError> {
let from = sketch_group.get_coords_from_paths()?;
let to = match data {
@ -70,7 +70,8 @@ fn inner_line_to(
relative: false,
},
},
)?;
)
.await?;
let current_path = Path::ToPoint {
base: BasePath {
@ -111,10 +112,10 @@ pub enum AxisLineToData {
}
/// Draw a line to a point on the x-axis.
pub fn x_line_to(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn x_line_to(args: Args) -> Result<MemoryItem, KclError> {
let (data, sketch_group): (AxisLineToData, Box<SketchGroup>) = args.get_data_and_sketch_group()?;
let new_sketch_group = inner_x_line_to(data, sketch_group, args)?;
let new_sketch_group = inner_x_line_to(data, sketch_group, args).await?;
Ok(MemoryItem::SketchGroup(new_sketch_group))
}
@ -122,10 +123,10 @@ pub fn x_line_to(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib {
name = "xLineTo",
}]
fn inner_x_line_to(
async fn inner_x_line_to(
data: AxisLineToData,
sketch_group: Box<SketchGroup>,
args: &mut Args,
args: Args,
) -> Result<Box<SketchGroup>, KclError> {
let from = sketch_group.get_coords_from_paths()?;
@ -134,16 +135,16 @@ fn inner_x_line_to(
AxisLineToData::Point(data) => LineToData::Point([data, from.y]),
};
let new_sketch_group = inner_line_to(line_to_data, sketch_group, args)?;
let new_sketch_group = inner_line_to(line_to_data, sketch_group, args).await?;
Ok(new_sketch_group)
}
/// Draw a line to a point on the y-axis.
pub fn y_line_to(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn y_line_to(args: Args) -> Result<MemoryItem, KclError> {
let (data, sketch_group): (AxisLineToData, Box<SketchGroup>) = args.get_data_and_sketch_group()?;
let new_sketch_group = inner_y_line_to(data, sketch_group, args)?;
let new_sketch_group = inner_y_line_to(data, sketch_group, args).await?;
Ok(MemoryItem::SketchGroup(new_sketch_group))
}
@ -151,10 +152,10 @@ pub fn y_line_to(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib {
name = "yLineTo",
}]
fn inner_y_line_to(
async fn inner_y_line_to(
data: AxisLineToData,
sketch_group: Box<SketchGroup>,
args: &mut Args,
args: Args,
) -> Result<Box<SketchGroup>, KclError> {
let from = sketch_group.get_coords_from_paths()?;
@ -163,7 +164,7 @@ fn inner_y_line_to(
AxisLineToData::Point(data) => LineToData::Point([from.x, data]),
};
let new_sketch_group = inner_line_to(line_to_data, sketch_group, args)?;
let new_sketch_group = inner_line_to(line_to_data, sketch_group, args).await?;
Ok(new_sketch_group)
}
@ -184,10 +185,10 @@ pub enum LineData {
}
/// Draw a line.
pub fn line(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn line(args: Args) -> Result<MemoryItem, KclError> {
let (data, sketch_group): (LineData, Box<SketchGroup>) = args.get_data_and_sketch_group()?;
let new_sketch_group = inner_line(data, sketch_group, args)?;
let new_sketch_group = inner_line(data, sketch_group, args).await?;
Ok(MemoryItem::SketchGroup(new_sketch_group))
}
@ -195,7 +196,7 @@ pub fn line(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib {
name = "line",
}]
fn inner_line(data: LineData, sketch_group: Box<SketchGroup>, args: &mut Args) -> Result<Box<SketchGroup>, KclError> {
async fn inner_line(data: LineData, sketch_group: Box<SketchGroup>, args: Args) -> Result<Box<SketchGroup>, KclError> {
let from = sketch_group.get_coords_from_paths()?;
let inner_args = match &data {
LineData::PointWithTag { to, .. } => *to,
@ -217,10 +218,11 @@ fn inner_line(data: LineData, sketch_group: Box<SketchGroup>, args: &mut Args) -
y: delta[1],
z: 0.0,
},
relative: true
relative: true,
},
},
)?;
)
.await?;
let current_path = Path::ToPoint {
base: BasePath {
@ -261,10 +263,10 @@ pub enum AxisLineData {
}
/// Draw a line on the x-axis.
pub fn x_line(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn x_line(args: Args) -> Result<MemoryItem, KclError> {
let (data, sketch_group): (AxisLineData, Box<SketchGroup>) = args.get_data_and_sketch_group()?;
let new_sketch_group = inner_x_line(data, sketch_group, args)?;
let new_sketch_group = inner_x_line(data, sketch_group, args).await?;
Ok(MemoryItem::SketchGroup(new_sketch_group))
}
@ -272,25 +274,25 @@ pub fn x_line(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib {
name = "xLine",
}]
fn inner_x_line(
async fn inner_x_line(
data: AxisLineData,
sketch_group: Box<SketchGroup>,
args: &mut Args,
args: Args,
) -> Result<Box<SketchGroup>, KclError> {
let line_data = match data {
AxisLineData::LengthWithTag { length, tag } => LineData::PointWithTag { to: [length, 0.0], tag },
AxisLineData::Length(length) => LineData::Point([length, 0.0]),
};
let new_sketch_group = inner_line(line_data, sketch_group, args)?;
let new_sketch_group = inner_line(line_data, sketch_group, args).await?;
Ok(new_sketch_group)
}
/// Draw a line on the y-axis.
pub fn y_line(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn y_line(args: Args) -> Result<MemoryItem, KclError> {
let (data, sketch_group): (AxisLineData, Box<SketchGroup>) = args.get_data_and_sketch_group()?;
let new_sketch_group = inner_y_line(data, sketch_group, args)?;
let new_sketch_group = inner_y_line(data, sketch_group, args).await?;
Ok(MemoryItem::SketchGroup(new_sketch_group))
}
@ -298,17 +300,17 @@ pub fn y_line(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib {
name = "yLine",
}]
fn inner_y_line(
async fn inner_y_line(
data: AxisLineData,
sketch_group: Box<SketchGroup>,
args: &mut Args,
args: Args,
) -> Result<Box<SketchGroup>, KclError> {
let line_data = match data {
AxisLineData::LengthWithTag { length, tag } => LineData::PointWithTag { to: [0.0, length], tag },
AxisLineData::Length(length) => LineData::Point([0.0, length]),
};
let new_sketch_group = inner_line(line_data, sketch_group, args)?;
let new_sketch_group = inner_line(line_data, sketch_group, args).await?;
Ok(new_sketch_group)
}
@ -331,10 +333,10 @@ pub enum AngledLineData {
}
/// Draw an angled line.
pub fn angled_line(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn angled_line(args: Args) -> Result<MemoryItem, KclError> {
let (data, sketch_group): (AngledLineData, Box<SketchGroup>) = args.get_data_and_sketch_group()?;
let new_sketch_group = inner_angled_line(data, sketch_group, args)?;
let new_sketch_group = inner_angled_line(data, sketch_group, args).await?;
Ok(MemoryItem::SketchGroup(new_sketch_group))
}
@ -342,10 +344,10 @@ pub fn angled_line(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib {
name = "angledLine",
}]
fn inner_angled_line(
async fn inner_angled_line(
data: AngledLineData,
sketch_group: Box<SketchGroup>,
args: &mut Args,
args: Args,
) -> Result<Box<SketchGroup>, KclError> {
let from = sketch_group.get_coords_from_paths()?;
let (angle, length) = match &data {
@ -360,10 +362,7 @@ fn inner_angled_line(
];
let relative = true;
let to: [f64; 2] = [
from.x + delta[0],
from.y + delta[1],
];
let to: [f64; 2] = [from.x + delta[0], from.y + delta[1]];
let id = uuid::Uuid::new_v4();
@ -393,10 +392,11 @@ fn inner_angled_line(
y: delta[1],
z: 0.0,
},
relative: relative,
relative,
},
},
)?;
)
.await?;
let mut new_sketch_group = sketch_group.clone();
new_sketch_group.value.push(current_path);
@ -404,10 +404,10 @@ fn inner_angled_line(
}
/// Draw an angled line of a given x length.
pub fn angled_line_of_x_length(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn angled_line_of_x_length(args: Args) -> Result<MemoryItem, KclError> {
let (data, sketch_group): (AngledLineData, Box<SketchGroup>) = args.get_data_and_sketch_group()?;
let new_sketch_group = inner_angled_line_of_x_length(data, sketch_group, args)?;
let new_sketch_group = inner_angled_line_of_x_length(data, sketch_group, args).await?;
Ok(MemoryItem::SketchGroup(new_sketch_group))
}
@ -415,10 +415,10 @@ pub fn angled_line_of_x_length(args: &mut Args) -> Result<MemoryItem, KclError>
#[stdlib {
name = "angledLineOfXLength",
}]
fn inner_angled_line_of_x_length(
async fn inner_angled_line_of_x_length(
data: AngledLineData,
sketch_group: Box<SketchGroup>,
args: &mut Args,
args: Args,
) -> Result<Box<SketchGroup>, KclError> {
let (angle, length) = match &data {
AngledLineData::AngleWithTag { angle, length, .. } => (*angle, *length),
@ -435,7 +435,8 @@ fn inner_angled_line_of_x_length(
},
sketch_group,
args,
)?;
)
.await?;
Ok(new_sketch_group)
}
@ -459,10 +460,10 @@ pub enum AngledLineToData {
}
/// Draw an angled line to a given x coordinate.
pub fn angled_line_to_x(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn angled_line_to_x(args: Args) -> Result<MemoryItem, KclError> {
let (data, sketch_group): (AngledLineToData, Box<SketchGroup>) = args.get_data_and_sketch_group()?;
let new_sketch_group = inner_angled_line_to_x(data, sketch_group, args)?;
let new_sketch_group = inner_angled_line_to_x(data, sketch_group, args).await?;
Ok(MemoryItem::SketchGroup(new_sketch_group))
}
@ -470,10 +471,10 @@ pub fn angled_line_to_x(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib {
name = "angledLineToX",
}]
fn inner_angled_line_to_x(
async fn inner_angled_line_to_x(
data: AngledLineToData,
sketch_group: Box<SketchGroup>,
args: &mut Args,
args: Args,
) -> Result<Box<SketchGroup>, KclError> {
let from = sketch_group.get_coords_from_paths()?;
let (angle, x_to) = match &data {
@ -493,15 +494,16 @@ fn inner_angled_line_to_x(
},
sketch_group,
args,
)?;
)
.await?;
Ok(new_sketch_group)
}
/// Draw an angled line of a given y length.
pub fn angled_line_of_y_length(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn angled_line_of_y_length(args: Args) -> Result<MemoryItem, KclError> {
let (data, sketch_group): (AngledLineData, Box<SketchGroup>) = args.get_data_and_sketch_group()?;
let new_sketch_group = inner_angled_line_of_y_length(data, sketch_group, args)?;
let new_sketch_group = inner_angled_line_of_y_length(data, sketch_group, args).await?;
Ok(MemoryItem::SketchGroup(new_sketch_group))
}
@ -510,10 +512,10 @@ pub fn angled_line_of_y_length(args: &mut Args) -> Result<MemoryItem, KclError>
#[stdlib {
name = "angledLineOfYLength",
}]
fn inner_angled_line_of_y_length(
async fn inner_angled_line_of_y_length(
data: AngledLineData,
sketch_group: Box<SketchGroup>,
args: &mut Args,
args: Args,
) -> Result<Box<SketchGroup>, KclError> {
let (angle, length) = match &data {
AngledLineData::AngleWithTag { angle, length, .. } => (*angle, *length),
@ -530,16 +532,17 @@ fn inner_angled_line_of_y_length(
},
sketch_group,
args,
)?;
)
.await?;
Ok(new_sketch_group)
}
/// Draw an angled line to a given y coordinate.
pub fn angled_line_to_y(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn angled_line_to_y(args: Args) -> Result<MemoryItem, KclError> {
let (data, sketch_group): (AngledLineToData, Box<SketchGroup>) = args.get_data_and_sketch_group()?;
let new_sketch_group = inner_angled_line_to_y(data, sketch_group, args)?;
let new_sketch_group = inner_angled_line_to_y(data, sketch_group, args).await?;
Ok(MemoryItem::SketchGroup(new_sketch_group))
}
@ -547,10 +550,10 @@ pub fn angled_line_to_y(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib {
name = "angledLineToY",
}]
fn inner_angled_line_to_y(
async fn inner_angled_line_to_y(
data: AngledLineToData,
sketch_group: Box<SketchGroup>,
args: &mut Args,
args: Args,
) -> Result<Box<SketchGroup>, KclError> {
let from = sketch_group.get_coords_from_paths()?;
let (angle, y_to) = match &data {
@ -570,7 +573,8 @@ fn inner_angled_line_to_y(
},
sketch_group,
args,
)?;
)
.await?;
Ok(new_sketch_group)
}
@ -591,9 +595,9 @@ pub struct AngeledLineThatIntersectsData {
}
/// Draw an angled line that intersects with a given line.
pub fn angled_line_that_intersects(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn angled_line_that_intersects(args: Args) -> Result<MemoryItem, KclError> {
let (data, sketch_group): (AngeledLineThatIntersectsData, Box<SketchGroup>) = args.get_data_and_sketch_group()?;
let new_sketch_group = inner_angled_line_that_intersects(data, sketch_group, args)?;
let new_sketch_group = inner_angled_line_that_intersects(data, sketch_group, args).await?;
Ok(MemoryItem::SketchGroup(new_sketch_group))
}
@ -601,10 +605,10 @@ pub fn angled_line_that_intersects(args: &mut Args) -> Result<MemoryItem, KclErr
#[stdlib {
name = "angledLineThatIntersects",
}]
fn inner_angled_line_that_intersects(
async fn inner_angled_line_that_intersects(
data: AngeledLineThatIntersectsData,
sketch_group: Box<SketchGroup>,
args: &mut Args,
args: Args,
) -> Result<Box<SketchGroup>, KclError> {
let intersect_path = sketch_group
.get_path_by_name(&data.intersect_tag)
@ -633,15 +637,15 @@ fn inner_angled_line_that_intersects(
LineToData::Point(to.into())
};
let new_sketch_group = inner_line_to(line_to_data, sketch_group, args)?;
let new_sketch_group = inner_line_to(line_to_data, sketch_group, args).await?;
Ok(new_sketch_group)
}
/// Start a sketch at a given point.
pub fn start_sketch_at(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn start_sketch_at(args: Args) -> Result<MemoryItem, KclError> {
let data: LineData = args.get_data()?;
let sketch_group = inner_start_sketch_at(data, args)?;
let sketch_group = inner_start_sketch_at(data, args).await?;
Ok(MemoryItem::SketchGroup(sketch_group))
}
@ -649,7 +653,7 @@ pub fn start_sketch_at(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib {
name = "startSketchAt",
}]
fn inner_start_sketch_at(data: LineData, args: &mut Args) -> Result<Box<SketchGroup>, KclError> {
async fn inner_start_sketch_at(data: LineData, args: Args) -> Result<Box<SketchGroup>, KclError> {
let to = match &data {
LineData::PointWithTag { to, .. } => *to,
LineData::Point(to) => *to,
@ -658,7 +662,7 @@ fn inner_start_sketch_at(data: LineData, args: &mut Args) -> Result<Box<SketchGr
let id = uuid::Uuid::new_v4();
let path_id = uuid::Uuid::new_v4();
args.send_modeling_cmd(path_id, ModelingCmd::StartPath {})?;
args.send_modeling_cmd(path_id, ModelingCmd::StartPath {}).await?;
args.send_modeling_cmd(
id,
ModelingCmd::MovePathPen {
@ -669,7 +673,8 @@ fn inner_start_sketch_at(data: LineData, args: &mut Args) -> Result<Box<SketchGr
z: 0.0,
},
},
)?;
)
.await?;
let current_path = BasePath {
from: to,
@ -697,10 +702,10 @@ fn inner_start_sketch_at(data: LineData, args: &mut Args) -> Result<Box<SketchGr
}
/// Close the current sketch.
pub fn close(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn close(args: Args) -> Result<MemoryItem, KclError> {
let sketch_group = args.get_sketch_group()?;
let new_sketch_group = inner_close(sketch_group, args)?;
let new_sketch_group = inner_close(sketch_group, args).await?;
Ok(MemoryItem::SketchGroup(new_sketch_group))
}
@ -709,7 +714,7 @@ pub fn close(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib {
name = "close",
}]
fn inner_close(sketch_group: Box<SketchGroup>, args: &mut Args) -> Result<Box<SketchGroup>, KclError> {
async fn inner_close(sketch_group: Box<SketchGroup>, args: Args) -> Result<Box<SketchGroup>, KclError> {
let from = sketch_group.get_coords_from_paths()?;
let to: Point2d = sketch_group.start.from.into();
@ -720,7 +725,8 @@ fn inner_close(sketch_group: Box<SketchGroup>, args: &mut Args) -> Result<Box<Sk
ModelingCmd::ClosePath {
path_id: sketch_group.id,
},
)?;
)
.await?;
let mut new_sketch_group = sketch_group.clone();
new_sketch_group.value.push(Path::ToPoint {
@ -787,10 +793,10 @@ pub enum ArcData {
}
/// Draw an arc.
pub fn arc(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn arc(args: Args) -> Result<MemoryItem, KclError> {
let (data, sketch_group): (ArcData, Box<SketchGroup>) = args.get_data_and_sketch_group()?;
let new_sketch_group = inner_arc(data, sketch_group, args)?;
let new_sketch_group = inner_arc(data, sketch_group, args).await?;
Ok(MemoryItem::SketchGroup(new_sketch_group))
}
@ -798,7 +804,7 @@ pub fn arc(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib {
name = "arc",
}]
fn inner_arc(data: ArcData, sketch_group: Box<SketchGroup>, args: &mut Args) -> Result<Box<SketchGroup>, KclError> {
async fn inner_arc(data: ArcData, sketch_group: Box<SketchGroup>, args: Args) -> Result<Box<SketchGroup>, KclError> {
let from: Point2d = sketch_group.get_coords_from_paths()?;
let (center, angle_start, angle_end, radius, end) = match &data {
@ -847,26 +853,8 @@ fn inner_arc(data: ArcData, sketch_group: Box<SketchGroup>, args: &mut Args) ->
relative: false,
},
},
)?;
// TODO: Dont do this (move path pen) - mike
// lets review what the needs are here and see if any existing arc endpoints can accomplish this
// Move the path pen to the end of the arc.
// Since that is where we want to draw the next path.
// TODO: the engine should automatically move the pen to the end of the arc.
// This just seems inefficient.
args.send_modeling_cmd(
id,
ModelingCmd::MovePathPen {
path: sketch_group.id,
to: Point3D {
x: end.x,
y: end.y,
z: 0.0,
},
},
)?;
)
.await?;
let current_path = Path::ToPoint {
base: BasePath {
@ -919,10 +907,10 @@ pub enum BezierData {
}
/// Draw a bezier curve.
pub fn bezier_curve(args: &mut Args) -> Result<MemoryItem, KclError> {
pub async fn bezier_curve(args: Args) -> Result<MemoryItem, KclError> {
let (data, sketch_group): (BezierData, Box<SketchGroup>) = args.get_data_and_sketch_group()?;
let new_sketch_group = inner_bezier_curve(data, sketch_group, args)?;
let new_sketch_group = inner_bezier_curve(data, sketch_group, args).await?;
Ok(MemoryItem::SketchGroup(new_sketch_group))
}
@ -930,10 +918,10 @@ pub fn bezier_curve(args: &mut Args) -> Result<MemoryItem, KclError> {
#[stdlib {
name = "bezierCurve",
}]
fn inner_bezier_curve(
async fn inner_bezier_curve(
data: BezierData,
sketch_group: Box<SketchGroup>,
args: &mut Args,
args: Args,
) -> Result<Box<SketchGroup>, KclError> {
let from = sketch_group.get_coords_from_paths()?;
@ -970,10 +958,11 @@ fn inner_bezier_curve(
y: delta[1],
z: 0.0,
},
relative: relative
relative,
},
},
)?;
)
.await?;
let current_path = Path::ToPoint {
base: BasePath {

View File

@ -0,0 +1,173 @@
use std::str::FromStr;
use anyhow::Result;
use parse_display::{Display, FromStr};
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use tower_lsp::lsp_types::SemanticTokenType;
mod tokeniser;
/// The types of tokens.
#[derive(Debug, PartialEq, Eq, Copy, Clone, Deserialize, Serialize, ts_rs::TS, JsonSchema, FromStr, Display)]
#[ts(export)]
#[serde(rename_all = "camelCase")]
#[display(style = "camelCase")]
pub enum TokenType {
/// A number.
Number,
/// A word.
Word,
/// An operator.
Operator,
/// A string.
String,
/// A keyword.
Keyword,
/// A brace.
Brace,
/// Whitespace.
Whitespace,
/// A comma.
Comma,
/// A colon.
Colon,
/// A period.
Period,
/// A double period: `..`.
DoublePeriod,
/// A line comment.
LineComment,
/// A block comment.
BlockComment,
/// A function name.
Function,
}
/// Most KCL tokens correspond to LSP semantic tokens (but not all).
impl TryFrom<TokenType> for SemanticTokenType {
type Error = anyhow::Error;
fn try_from(token_type: TokenType) -> Result<Self> {
Ok(match token_type {
TokenType::Number => Self::NUMBER,
TokenType::Word => Self::VARIABLE,
TokenType::Keyword => Self::KEYWORD,
TokenType::Operator => Self::OPERATOR,
TokenType::String => Self::STRING,
TokenType::LineComment => Self::COMMENT,
TokenType::BlockComment => Self::COMMENT,
TokenType::Function => Self::FUNCTION,
TokenType::Whitespace
| TokenType::Brace
| TokenType::Comma
| TokenType::Colon
| TokenType::Period
| TokenType::DoublePeriod => {
anyhow::bail!("unsupported token type: {:?}", token_type)
}
})
}
}
impl TokenType {
// This is for the lsp server.
pub fn all_semantic_token_types() -> Result<Vec<SemanticTokenType>> {
let mut settings = schemars::gen::SchemaSettings::openapi3();
settings.inline_subschemas = true;
let mut generator = schemars::gen::SchemaGenerator::new(settings);
let schema = TokenType::json_schema(&mut generator);
let schemars::schema::Schema::Object(o) = &schema else {
anyhow::bail!("expected object schema: {:#?}", schema);
};
let Some(subschemas) = &o.subschemas else {
anyhow::bail!("expected subschemas: {:#?}", schema);
};
let Some(one_ofs) = &subschemas.one_of else {
anyhow::bail!("expected one_of: {:#?}", schema);
};
let mut semantic_tokens = vec![];
for one_of in one_ofs {
let schemars::schema::Schema::Object(o) = one_of else {
anyhow::bail!("expected object one_of: {:#?}", one_of);
};
let Some(enum_values) = o.enum_values.as_ref() else {
anyhow::bail!("expected enum values: {:#?}", o);
};
if enum_values.len() > 1 {
anyhow::bail!("expected only one enum value: {:#?}", o);
}
if enum_values.is_empty() {
anyhow::bail!("expected at least one enum value: {:#?}", o);
}
let label = TokenType::from_str(&enum_values[0].to_string().replace('"', ""))?;
if let Ok(semantic_token_type) = SemanticTokenType::try_from(label) {
semantic_tokens.push(semantic_token_type);
}
}
Ok(semantic_tokens)
}
}
#[derive(Debug, PartialEq, Eq, Deserialize, Serialize, Clone, ts_rs::TS)]
#[ts(export)]
pub struct Token {
#[serde(rename = "type")]
pub token_type: TokenType,
/// Offset in the source code where this token begins.
pub start: usize,
/// Offset in the source code where this token ends.
pub end: usize,
pub value: String,
}
impl Token {
pub fn from_range(range: std::ops::Range<usize>, token_type: TokenType, value: String) -> Self {
Self {
start: range.start,
end: range.end,
value,
token_type,
}
}
pub fn is_code_token(&self) -> bool {
!matches!(
self.token_type,
TokenType::Whitespace | TokenType::LineComment | TokenType::BlockComment
)
}
}
impl From<Token> for crate::executor::SourceRange {
fn from(token: Token) -> Self {
Self([token.start, token.end])
}
}
impl From<&Token> for crate::executor::SourceRange {
fn from(token: &Token) -> Self {
Self([token.start, token.end])
}
}
pub fn lexer(s: &str) -> Vec<Token> {
tokeniser::lexer(s).unwrap_or_default()
}
#[cfg(test)]
mod tests {
use super::*;
// We have this as a test so we can ensure it never panics with an unwrap in the server.
#[test]
fn test_token_type_to_semantic_token_type() {
let semantic_types = TokenType::all_semantic_token_types().unwrap();
assert!(!semantic_types.is_empty());
}
}
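
The public surface of the new module is small: `lexer` plus the `Token` and `TokenType` types. A minimal usage sketch, assuming the module is reachable as `kcl_lib::token` (as the wasm bindings below use it) and that the winnow tokenizer preserves the behaviour pinned down by the old tests, e.g. `const a=5` lexing to keyword, whitespace, word, operator, number:

use kcl_lib::token::{lexer, TokenType};

fn count_code_tokens(src: &str) -> usize {
    // `lexer` returns an empty Vec instead of panicking on tokenizer errors.
    lexer(src).iter().filter(|t| t.is_code_token()).count()
}

fn main() {
    // Of the five tokens in "const a=5", only the whitespace is filtered out.
    assert_eq!(count_code_tokens("const a=5"), 4);
    assert!(lexer("const").iter().all(|t| t.token_type == TokenType::Keyword));
}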

File diff suppressed because it is too large

View File

@ -1,744 +0,0 @@
use std::str::FromStr;
use anyhow::Result;
use lazy_static::lazy_static;
use parse_display::{Display, FromStr};
use regex::Regex;
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use tower_lsp::lsp_types::SemanticTokenType;
/// The types of tokens.
#[derive(Debug, PartialEq, Eq, Copy, Clone, Deserialize, Serialize, ts_rs::TS, JsonSchema, FromStr, Display)]
#[ts(export)]
#[serde(rename_all = "camelCase")]
#[display(style = "camelCase")]
pub enum TokenType {
/// A number.
Number,
/// A word.
Word,
/// An operator.
Operator,
/// A string.
String,
/// A keyword.
Keyword,
/// A brace.
Brace,
/// Whitespace.
Whitespace,
/// A comma.
Comma,
/// A colon.
Colon,
/// A period.
Period,
/// A double period: `..`.
DoublePeriod,
/// A line comment.
LineComment,
/// A block comment.
BlockComment,
/// A function name.
Function,
}
impl TryFrom<TokenType> for SemanticTokenType {
type Error = anyhow::Error;
fn try_from(token_type: TokenType) -> Result<Self> {
Ok(match token_type {
TokenType::Number => Self::NUMBER,
TokenType::Word => Self::VARIABLE,
TokenType::Keyword => Self::KEYWORD,
TokenType::Operator => Self::OPERATOR,
TokenType::String => Self::STRING,
TokenType::LineComment => Self::COMMENT,
TokenType::BlockComment => Self::COMMENT,
TokenType::Function => Self::FUNCTION,
TokenType::Whitespace
| TokenType::Brace
| TokenType::Comma
| TokenType::Colon
| TokenType::Period
| TokenType::DoublePeriod => {
anyhow::bail!("unsupported token type: {:?}", token_type)
}
})
}
}
impl TokenType {
// This is for the lsp server.
pub fn to_semantic_token_types() -> Result<Vec<SemanticTokenType>> {
let mut settings = schemars::gen::SchemaSettings::openapi3();
settings.inline_subschemas = true;
let mut generator = schemars::gen::SchemaGenerator::new(settings);
let schema = TokenType::json_schema(&mut generator);
let schemars::schema::Schema::Object(o) = &schema else {
anyhow::bail!("expected object schema: {:#?}", schema);
};
let Some(subschemas) = &o.subschemas else {
anyhow::bail!("expected subschemas: {:#?}", schema);
};
let Some(one_ofs) = &subschemas.one_of else {
anyhow::bail!("expected one_of: {:#?}", schema);
};
let mut semantic_tokens = vec![];
for one_of in one_ofs {
let schemars::schema::Schema::Object(o) = one_of else {
anyhow::bail!("expected object one_of: {:#?}", one_of);
};
let Some(enum_values) = o.enum_values.as_ref() else {
anyhow::bail!("expected enum values: {:#?}", o);
};
if enum_values.len() > 1 {
anyhow::bail!("expected only one enum value: {:#?}", o);
}
if enum_values.is_empty() {
anyhow::bail!("expected at least one enum value: {:#?}", o);
}
let label = TokenType::from_str(&enum_values[0].to_string().replace('"', ""))?;
if let Ok(semantic_token_type) = SemanticTokenType::try_from(label) {
semantic_tokens.push(semantic_token_type);
}
}
Ok(semantic_tokens)
}
}
#[derive(Debug, PartialEq, Eq, Deserialize, Serialize, Clone, ts_rs::TS)]
#[ts(export)]
pub struct Token {
#[serde(rename = "type")]
pub token_type: TokenType,
pub start: usize,
pub end: usize,
pub value: String,
}
impl From<Token> for crate::executor::SourceRange {
fn from(token: Token) -> Self {
Self([token.start, token.end])
}
}
impl From<&Token> for crate::executor::SourceRange {
fn from(token: &Token) -> Self {
Self([token.start, token.end])
}
}
lazy_static! {
static ref NUMBER: Regex = Regex::new(r"^(\d+(\.\d*)?|\.\d+)\b").unwrap();
static ref WHITESPACE: Regex = Regex::new(r"\s+").unwrap();
static ref WORD: Regex = Regex::new(r"^[a-zA-Z_][a-zA-Z0-9_]*").unwrap();
// TODO: these should be generated using our struct types for these.
static ref KEYWORD: Regex =
Regex::new(r"^(if|else|for|while|return|break|continue|fn|let|mut|loop|true|false|nil|and|or|not|var|const)\b").unwrap();
static ref OPERATOR: Regex = Regex::new(r"^(>=|<=|==|=>|!= |\|>|\*|\+|-|/|%|=|<|>|\||\^)").unwrap();
static ref STRING: Regex = Regex::new(r#"^"([^"\\]|\\.)*"|'([^'\\]|\\.)*'"#).unwrap();
static ref BLOCK_START: Regex = Regex::new(r"^\{").unwrap();
static ref BLOCK_END: Regex = Regex::new(r"^\}").unwrap();
static ref PARAN_START: Regex = Regex::new(r"^\(").unwrap();
static ref PARAN_END: Regex = Regex::new(r"^\)").unwrap();
static ref ARRAY_START: Regex = Regex::new(r"^\[").unwrap();
static ref ARRAY_END: Regex = Regex::new(r"^\]").unwrap();
static ref COMMA: Regex = Regex::new(r"^,").unwrap();
static ref COLON: Regex = Regex::new(r"^:").unwrap();
static ref PERIOD: Regex = Regex::new(r"^\.").unwrap();
static ref DOUBLE_PERIOD: Regex = Regex::new(r"^\.\.").unwrap();
static ref LINECOMMENT: Regex = Regex::new(r"^//.*").unwrap();
static ref BLOCKCOMMENT: Regex = Regex::new(r"^/\*[\s\S]*?\*/").unwrap();
}
fn is_number(character: &str) -> bool {
NUMBER.is_match(character)
}
fn is_whitespace(character: &str) -> bool {
WHITESPACE.is_match(character)
}
fn is_word(character: &str) -> bool {
WORD.is_match(character)
}
fn is_keyword(character: &str) -> bool {
KEYWORD.is_match(character)
}
fn is_string(character: &str) -> bool {
match STRING.find(character) {
Some(m) => m.start() == 0,
None => false,
}
}
fn is_operator(character: &str) -> bool {
OPERATOR.is_match(character)
}
fn is_block_start(character: &str) -> bool {
BLOCK_START.is_match(character)
}
fn is_block_end(character: &str) -> bool {
BLOCK_END.is_match(character)
}
fn is_paran_start(character: &str) -> bool {
PARAN_START.is_match(character)
}
fn is_paran_end(character: &str) -> bool {
PARAN_END.is_match(character)
}
fn is_array_start(character: &str) -> bool {
ARRAY_START.is_match(character)
}
fn is_array_end(character: &str) -> bool {
ARRAY_END.is_match(character)
}
fn is_comma(character: &str) -> bool {
COMMA.is_match(character)
}
fn is_colon(character: &str) -> bool {
COLON.is_match(character)
}
fn is_double_period(character: &str) -> bool {
DOUBLE_PERIOD.is_match(character)
}
fn is_period(character: &str) -> bool {
PERIOD.is_match(character)
}
fn is_line_comment(character: &str) -> bool {
LINECOMMENT.is_match(character)
}
fn is_block_comment(character: &str) -> bool {
BLOCKCOMMENT.is_match(character)
}
fn match_first(s: &str, regex: &Regex) -> Option<String> {
regex.find(s).map(|the_match| the_match.as_str().to_string())
}
fn make_token(token_type: TokenType, value: &str, start: usize) -> Token {
Token {
token_type,
value: value.to_string(),
start,
end: start + value.len(),
}
}
fn return_token_at_index(s: &str, start_index: usize) -> Option<Token> {
let str_from_index = &s.chars().skip(start_index).collect::<String>();
if is_string(str_from_index) {
return Some(make_token(
TokenType::String,
&match_first(str_from_index, &STRING)?,
start_index,
));
}
let is_line_comment_bool = is_line_comment(str_from_index);
if is_line_comment_bool || is_block_comment(str_from_index) {
return Some(make_token(
if is_line_comment_bool {
TokenType::LineComment
} else {
TokenType::BlockComment
},
&match_first(
str_from_index,
if is_line_comment_bool {
&LINECOMMENT
} else {
&BLOCKCOMMENT
},
)?,
start_index,
));
}
if is_paran_end(str_from_index) {
return Some(make_token(
TokenType::Brace,
&match_first(str_from_index, &PARAN_END)?,
start_index,
));
}
if is_paran_start(str_from_index) {
return Some(make_token(
TokenType::Brace,
&match_first(str_from_index, &PARAN_START)?,
start_index,
));
}
if is_block_start(str_from_index) {
return Some(make_token(
TokenType::Brace,
&match_first(str_from_index, &BLOCK_START)?,
start_index,
));
}
if is_block_end(str_from_index) {
return Some(make_token(
TokenType::Brace,
&match_first(str_from_index, &BLOCK_END)?,
start_index,
));
}
if is_array_start(str_from_index) {
return Some(make_token(
TokenType::Brace,
&match_first(str_from_index, &ARRAY_START)?,
start_index,
));
}
if is_array_end(str_from_index) {
return Some(make_token(
TokenType::Brace,
&match_first(str_from_index, &ARRAY_END)?,
start_index,
));
}
if is_comma(str_from_index) {
return Some(make_token(
TokenType::Comma,
&match_first(str_from_index, &COMMA)?,
start_index,
));
}
if is_operator(str_from_index) {
return Some(make_token(
TokenType::Operator,
&match_first(str_from_index, &OPERATOR)?,
start_index,
));
}
if is_number(str_from_index) {
return Some(make_token(
TokenType::Number,
&match_first(str_from_index, &NUMBER)?,
start_index,
));
}
if is_keyword(str_from_index) {
return Some(make_token(
TokenType::Keyword,
&match_first(str_from_index, &KEYWORD)?,
start_index,
));
}
if is_word(str_from_index) {
return Some(make_token(
TokenType::Word,
&match_first(str_from_index, &WORD)?,
start_index,
));
}
if is_colon(str_from_index) {
return Some(make_token(
TokenType::Colon,
&match_first(str_from_index, &COLON)?,
start_index,
));
}
if is_double_period(str_from_index) {
return Some(make_token(
TokenType::DoublePeriod,
&match_first(str_from_index, &DOUBLE_PERIOD)?,
start_index,
));
}
if is_period(str_from_index) {
return Some(make_token(
TokenType::Period,
&match_first(str_from_index, &PERIOD)?,
start_index,
));
}
if is_whitespace(str_from_index) {
return Some(make_token(
TokenType::Whitespace,
&match_first(str_from_index, &WHITESPACE)?,
start_index,
));
}
None
}
fn recursively_tokenise(s: &str, current_index: usize, previous_tokens: Vec<Token>) -> Vec<Token> {
if current_index >= s.len() {
return previous_tokens;
}
let token = return_token_at_index(s, current_index);
let Some(token) = token else {
return recursively_tokenise(s, current_index + 1, previous_tokens);
};
let mut new_tokens = previous_tokens;
let token_length = token.value.len();
new_tokens.push(token);
recursively_tokenise(s, current_index + token_length, new_tokens)
}
pub fn lexer(s: &str) -> Vec<Token> {
recursively_tokenise(s, 0, Vec::new())
}
#[cfg(test)]
mod tests {
use pretty_assertions::assert_eq;
use super::*;
#[test]
fn is_number_test() {
assert!(is_number("1"));
assert!(is_number("1 abc"));
assert!(is_number("1.1"));
assert!(is_number("1.1 abc"));
assert!(!is_number("a"));
assert!(is_number("1"));
assert!(is_number(".1"));
assert!(is_number("5?"));
assert!(is_number("5 + 6"));
assert!(is_number("5 + a"));
assert!(is_number("5.5"));
assert!(!is_number("1abc"));
assert!(!is_number("a"));
assert!(!is_number("?"));
assert!(!is_number("?5"));
}
#[test]
fn is_whitespace_test() {
assert!(is_whitespace(" "));
assert!(is_whitespace(" "));
assert!(is_whitespace(" a"));
assert!(is_whitespace("a "));
assert!(!is_whitespace("a"));
assert!(!is_whitespace("?"));
}
#[test]
fn is_word_test() {
assert!(is_word("a"));
assert!(is_word("a "));
assert!(is_word("a5"));
assert!(is_word("a5a"));
assert!(!is_word("5"));
assert!(!is_word("5a"));
assert!(!is_word("5a5"));
}
#[test]
fn is_string_test() {
assert!(is_string("\"\""));
assert!(is_string("\"a\""));
assert!(is_string("\"a\" "));
assert!(is_string("\"a\"5"));
assert!(is_string("'a'5"));
assert!(is_string("\"with escaped \\\" backslash\""));
assert!(!is_string("\""));
assert!(!is_string("\"a"));
assert!(!is_string("a\""));
assert!(!is_string(" \"a\""));
assert!(!is_string("5\"a\""));
assert!(!is_string("a + 'str'"));
assert!(is_string("'c'"));
}
#[test]
fn is_operator_test() {
assert!(is_operator("+"));
assert!(is_operator("+ "));
assert!(is_operator("-"));
assert!(is_operator("<="));
assert!(is_operator("<= "));
assert!(is_operator(">="));
assert!(is_operator(">= "));
assert!(is_operator("> "));
assert!(is_operator("< "));
assert!(is_operator("| "));
assert!(is_operator("|> "));
assert!(is_operator("^ "));
assert!(is_operator("% "));
assert!(is_operator("+* "));
assert!(!is_operator("5 + 5"));
assert!(!is_operator("a"));
assert!(!is_operator("a+"));
assert!(!is_operator("a+5"));
assert!(!is_operator("5a+5"));
assert!(!is_operator(", newVar"));
assert!(!is_operator(","));
}
#[test]
fn is_block_start_test() {
assert!(is_block_start("{"));
assert!(is_block_start("{ "));
assert!(is_block_start("{5"));
assert!(is_block_start("{a"));
assert!(is_block_start("{5 "));
assert!(!is_block_start("5"));
assert!(!is_block_start("5 + 5"));
assert!(!is_block_start("5{ + 5"));
assert!(!is_block_start("a{ + 5"));
assert!(!is_block_start(" { + 5"));
}
#[test]
fn is_block_end_test() {
assert!(is_block_end("}"));
assert!(is_block_end("} "));
assert!(is_block_end("}5"));
assert!(is_block_end("}5 "));
assert!(!is_block_end("5"));
assert!(!is_block_end("5 + 5"));
assert!(!is_block_end("5} + 5"));
assert!(!is_block_end(" } + 5"));
}
#[test]
fn is_paran_start_test() {
assert!(is_paran_start("("));
assert!(is_paran_start("( "));
assert!(is_paran_start("(5"));
assert!(is_paran_start("(5 "));
assert!(is_paran_start("(5 + 5"));
assert!(is_paran_start("(5 + 5)"));
assert!(is_paran_start("(5 + 5) "));
assert!(!is_paran_start("5"));
assert!(!is_paran_start("5 + 5"));
assert!(!is_paran_start("5( + 5)"));
assert!(!is_paran_start(" ( + 5)"));
}
#[test]
fn is_paran_end_test() {
assert!(is_paran_end(")"));
assert!(is_paran_end(") "));
assert!(is_paran_end(")5"));
assert!(is_paran_end(")5 "));
assert!(!is_paran_end("5"));
assert!(!is_paran_end("5 + 5"));
assert!(!is_paran_end("5) + 5"));
assert!(!is_paran_end(" ) + 5"));
}
#[test]
fn is_comma_test() {
assert!(is_comma(","));
assert!(is_comma(", "));
assert!(is_comma(",5"));
assert!(is_comma(",5 "));
assert!(!is_comma("5"));
assert!(!is_comma("5 + 5"));
assert!(!is_comma("5, + 5"));
assert!(!is_comma(" , + 5"));
}
#[test]
fn is_line_comment_test() {
assert!(is_line_comment("//"));
assert!(is_line_comment("// "));
assert!(is_line_comment("//5"));
assert!(is_line_comment("//5 "));
assert!(!is_line_comment("5"));
assert!(!is_line_comment("5 + 5"));
assert!(!is_line_comment("5// + 5"));
assert!(!is_line_comment(" // + 5"));
}
#[test]
fn is_block_comment_test() {
assert!(is_block_comment("/* */"));
assert!(is_block_comment("/***/"));
assert!(is_block_comment("/*5*/"));
assert!(is_block_comment("/*5 */"));
assert!(!is_block_comment("/*"));
assert!(!is_block_comment("5"));
assert!(!is_block_comment("5 + 5"));
assert!(!is_block_comment("5/* + 5"));
assert!(!is_block_comment(" /* + 5"));
assert!(!is_block_comment(
r#" /* and
here
*/
"#
));
}
#[test]
fn make_token_test() {
assert_eq!(
make_token(TokenType::Keyword, "const", 56),
Token {
token_type: TokenType::Keyword,
value: "const".to_string(),
start: 56,
end: 61,
}
);
}
#[test]
fn return_token_at_index_test() {
assert_eq!(
return_token_at_index("const", 0),
Some(Token {
token_type: TokenType::Keyword,
value: "const".to_string(),
start: 0,
end: 5,
})
);
assert_eq!(
return_token_at_index(" 4554", 2),
Some(Token {
token_type: TokenType::Number,
value: "4554".to_string(),
start: 2,
end: 6,
})
);
}
#[test]
fn lexer_test() {
assert_eq!(
lexer("const a=5"),
vec![
Token {
token_type: TokenType::Keyword,
value: "const".to_string(),
start: 0,
end: 5,
},
Token {
token_type: TokenType::Whitespace,
value: " ".to_string(),
start: 5,
end: 6,
},
Token {
token_type: TokenType::Word,
value: "a".to_string(),
start: 6,
end: 7,
},
Token {
token_type: TokenType::Operator,
value: "=".to_string(),
start: 7,
end: 8,
},
Token {
token_type: TokenType::Number,
value: "5".to_string(),
start: 8,
end: 9,
},
]
);
assert_eq!(
lexer("54 + 22500 + 6"),
vec![
Token {
token_type: TokenType::Number,
value: "54".to_string(),
start: 0,
end: 2,
},
Token {
token_type: TokenType::Whitespace,
value: " ".to_string(),
start: 2,
end: 3,
},
Token {
token_type: TokenType::Operator,
value: "+".to_string(),
start: 3,
end: 4,
},
Token {
token_type: TokenType::Whitespace,
value: " ".to_string(),
start: 4,
end: 5,
},
Token {
token_type: TokenType::Number,
value: "22500".to_string(),
start: 5,
end: 10,
},
Token {
token_type: TokenType::Whitespace,
value: " ".to_string(),
start: 10,
end: 11,
},
Token {
token_type: TokenType::Operator,
value: "+".to_string(),
start: 11,
end: 12,
},
Token {
token_type: TokenType::Whitespace,
value: " ".to_string(),
start: 12,
end: 13,
},
Token {
token_type: TokenType::Number,
value: "6".to_string(),
start: 13,
end: 14,
},
]
);
}
// We have this as a test so we can ensure it never panics with an unwrap in the server.
#[test]
fn test_token_type_to_semantic_token_type() {
let semantic_types = TokenType::to_semantic_token_types().unwrap();
assert!(!semantic_types.is_empty());
}
#[test]
fn test_lexer_negative_word() {
assert_eq!(
lexer("-legX"),
vec![
Token {
token_type: TokenType::Operator,
value: "-".to_string(),
start: 0,
end: 1,
},
Token {
token_type: TokenType::Word,
value: "legX".to_string(),
start: 1,
end: 5,
},
]
);
}
}

View File

@ -21,11 +21,12 @@ pub async fn execute_wasm(
let program: kcl_lib::ast::types::Program = serde_json::from_str(program_str).map_err(|e| e.to_string())?;
let mut mem: kcl_lib::executor::ProgramMemory = serde_json::from_str(memory_str).map_err(|e| e.to_string())?;
let mut engine = kcl_lib::engine::EngineConnection::new(manager)
let engine = kcl_lib::engine::EngineConnection::new(manager)
.await
.map_err(|e| format!("{:?}", e))?;
let memory = kcl_lib::executor::execute(program, &mut mem, kcl_lib::executor::BodyType::Root, &mut engine)
let memory = kcl_lib::executor::execute(program, &mut mem, kcl_lib::executor::BodyType::Root, &engine)
.await
.map_err(String::from)?;
// The serde-wasm-bindgen does not work here because of weird HashMap issues so we use the
// gloo-serialize crate instead.
@ -83,13 +84,13 @@ pub fn deserialize_files(data: &[u8]) -> Result<JsValue, JsError> {
// Tests for this function (and, by extension, the lexer) live in JavaScript land: src/lang/tokeniser.test.ts
#[wasm_bindgen]
pub fn lexer_js(js: &str) -> Result<JsValue, JsError> {
let tokens = kcl_lib::tokeniser::lexer(js);
let tokens = kcl_lib::token::lexer(js);
Ok(JsValue::from_serde(&tokens)?)
}
#[wasm_bindgen]
pub fn parse_js(js: &str) -> Result<JsValue, String> {
let tokens = kcl_lib::tokeniser::lexer(js);
let tokens = kcl_lib::token::lexer(js);
let parser = kcl_lib::parser::Parser::new(tokens);
let program = parser.ast().map_err(String::from)?;
// The serde-wasm-bindgen does not work here because of weird HashMap issues so we use the
@ -148,7 +149,7 @@ pub async fn lsp_run(config: ServerConfig) -> Result<(), JsValue> {
let stdlib_signatures = get_signatures_from_stdlib(&stdlib).map_err(|e| e.to_string())?;
// We can unwrap here because we know the tokeniser is valid, since
// we have a test for it.
let token_types = kcl_lib::tokeniser::TokenType::to_semantic_token_types().unwrap();
let token_types = kcl_lib::token::TokenType::all_semantic_token_types().unwrap();
let (service, socket) = LspService::new(|client| Backend {
client,

View File

@ -0,0 +1,310 @@
const svg = startSketchAt([0, 0])
|> lineTo([2.52, -26.04], %) // MoveAbsolute
|> lineTo([2.52, -25.2], %) // VerticalLineAbsolute
|> lineTo([0.84, -25.2], %) // HorizontalLineAbsolute
|> lineTo([0.84, -24.36], %) // VerticalLineAbsolute
|> lineTo([0, -24.36], %) // HorizontalLineAbsolute
|> lineTo([0, -6.72], %) // VerticalLineAbsolute
|> lineTo([0.84, -6.72], %) // HorizontalLineAbsolute
|> lineTo([0.84, -5.88], %) // VerticalLineAbsolute
|> lineTo([1.68, -5.88], %) // HorizontalLineAbsolute
|> lineTo([1.68, -5.04], %) // VerticalLineAbsolute
|> lineTo([2.52, -5.04], %) // HorizontalLineAbsolute
|> lineTo([2.52, -4.2], %) // VerticalLineAbsolute
|> lineTo([3.36, -4.2], %) // HorizontalLineAbsolute
|> lineTo([3.36, -3.36], %) // VerticalLineAbsolute
|> lineTo([17.64, -3.36], %) // HorizontalLineAbsolute
|> lineTo([17.64, -4.2], %) // VerticalLineAbsolute
|> lineTo([18.48, -4.2], %) // HorizontalLineRelative
|> lineTo([18.48, -5.04], %) // VerticalLineHorizonal
|> lineTo([19.32, -5.04], %) // HorizontalLineRelative
|> lineTo([19.32, -5.88], %) // VerticalLineHorizonal
|> lineTo([20.16, -5.88], %) // HorizontalLineRelative
|> lineTo([20.16, -6.72], %) // VerticalLineAbsolute
|> lineTo([21, -6.72], %) // HorizontalLineAbsolute
|> lineTo([21, -24.36], %) // VerticalLineHorizonal
|> lineTo([20.16, -24.36], %) // HorizontalLineRelative
|> lineTo([20.16, -25.2], %) // VerticalLineHorizonal
|> lineTo([18.48, -25.2], %) // HorizontalLineRelative
|> lineTo([18.48, -26.04], %) // VerticalLineHorizonal
|> lineTo([15.96, -26.04], %) // HorizontalLineRelative
|> lineTo([15.96, -26.88], %) // VerticalLineHorizonal
|> lineTo([16.8, -26.88], %) // HorizontalLineRelative
|> lineTo([16.8, -28.56], %) // VerticalLineHorizonal
|> lineTo([11.76, -28.56], %) // HorizontalLineAbsolute
|> lineTo([11.76, -26.88], %) // VerticalLineAbsolute
|> lineTo([12.6, -26.88], %) // HorizontalLineAbsolute
|> lineTo([12.6, -26.04], %) // VerticalLineAbsolute
|> lineTo([8.4, -26.04], %) // HorizontalLineAbsolute
|> lineTo([8.4, -26.88], %) // VerticalLineHorizonal
|> lineTo([9.24, -26.88], %) // HorizontalLineRelative
|> lineTo([9.24, -28.56], %) // VerticalLineHorizonal
|> lineTo([4.2, -28.56], %) // HorizontalLineAbsolute
|> lineTo([4.2, -26.88], %) // VerticalLineHorizonal
|> lineTo([5.04, -26.88], %) // HorizontalLineRelative
|> lineTo([5.04, -26.04], %) // VerticalLineHorizonal
|> lineTo([0.839996, -20.58], %) // MoveRelative
|> lineTo([0.839996, -24.36], %) // VerticalLineHorizonal
|> lineTo([2.52, -24.36], %) // HorizontalLineAbsolute
|> lineTo([2.52, -25.2], %) // VerticalLineHorizonal
|> lineTo([18.48, -25.2], %) // HorizontalLineRelative
|> lineTo([18.48, -24.36], %) // VerticalLineHorizonal
|> lineTo([20.16, -24.36], %) // HorizontalLineRelative
|> lineTo([20.16, -20.58], %) // VerticalLineAbsolute
// StopAbsolute
|> lineTo([7.56, -24.36], %) // MoveAbsolute
|> lineTo([7.56, -22.68], %) // VerticalLineHorizonal
|> lineTo([13.44, -22.68], %) // HorizontalLineRelative
|> lineTo([13.44, -24.36], %) // VerticalLineHorizonal
|> lineTo([1.68, -22.68], %) // MoveRelative
|> lineTo([1.68, -21.84], %) // VerticalLineHorizonal
|> lineTo([5.88, -21.84], %) // HorizontalLineRelative
|> lineTo([5.88, -22.68], %) // VerticalLineHorizonal
|> lineTo([3.36, -24.36], %) // MoveRelative
|> lineTo([3.36, -23.52], %) // VerticalLineHorizonal
|> lineTo([5.88, -23.52], %) // HorizontalLineRelative
|> lineTo([5.88, -24.36], %) // VerticalLineHorizonal
|> lineTo([15.12, -22.68], %) // MoveRelative
|> lineTo([15.12, -21.84], %) // VerticalLineHorizonal
|> lineTo([15.959999999999999, -21.84], %) // HorizontalLineRelative
|> lineTo([15.959999999999999, -22.68], %) // VerticalLineHorizonal
|> lineTo([16.8, -22.68], %) // MoveRelative
|> lineTo([16.8, -21.84], %) // VerticalLineHorizonal
|> lineTo([17.64, -21.84], %) // HorizontalLineRelative
|> lineTo([17.64, -22.68], %) // VerticalLineHorizonal
|> lineTo([18.48, -22.68], %) // MoveRelative
|> lineTo([18.48, -21.84], %) // VerticalLineHorizonal
|> lineTo([19.32, -21.84], %) // HorizontalLineRelative
|> lineTo([19.32, -22.68], %) // VerticalLineHorizonal
|> lineTo([15.12, -24.36], %) // MoveRelative
|> lineTo([15.12, -23.52], %) // VerticalLineHorizonal
|> lineTo([17.64, -23.52], %) // HorizontalLineRelative
|> lineTo([17.64, -24.36], %) // VerticalLineHorizonal
|> lineTo([18.48, -5.88], %) // MoveAbsolute
|> lineTo([18.48, -5.04], %) // VerticalLineAbsolute
|> lineTo([17.64, -5.04], %) // HorizontalLineAbsolute
|> lineTo([17.64, -4.2], %) // VerticalLineAbsolute
|> lineTo([3.36, -4.2], %) // HorizontalLineAbsolute
|> lineTo([3.36, -5.04], %) // VerticalLineAbsolute
|> lineTo([2.52, -5.04], %) // HorizontalLineAbsolute
|> lineTo([2.52, -5.88], %) // VerticalLineAbsolute
|> lineTo([1.68, -5.88], %) // HorizontalLineAbsolute
|> lineTo([1.68, -6.72], %) // VerticalLineAbsolute
|> lineTo([0.839996, -6.72], %) // HorizontalLineAbsolute
|> lineTo([0.839996, -8.4], %) // VerticalLineAbsolute
|> lineTo([20.16, -8.4], %) // HorizontalLineAbsolute
|> lineTo([20.16, -6.72], %) // VerticalLineAbsolute
|> lineTo([19.32, -6.72], %) // HorizontalLineAbsolute
|> lineTo([19.32, -5.88], %) // VerticalLineAbsolute
|> lineTo([20.16, -7.56], %) // MoveAbsolute
|> lineTo([0.839996, -7.56], %) // HorizontalLineAbsolute
|> lineTo([0.839996, -19.32], %) // VerticalLineAbsolute
|> lineTo([20.16, -19.32], %) // HorizontalLineAbsolute
|> lineTo([3.36, -10.08], %) // MoveAbsolute
|> lineTo([3.36, -9.24001], %) // VerticalLineAbsolute
|> lineTo([17.64, -9.24001], %) // HorizontalLineAbsolute
|> lineTo([17.64, -10.08], %) // VerticalLineAbsolute
|> lineTo([18.48, -10.08], %) // HorizontalLineRelative
|> lineTo([18.48, -16.8], %) // VerticalLineHorizonal
|> lineTo([17.64, -16.8], %) // HorizontalLineRelative
|> lineTo([17.64, -17.64], %) // VerticalLineHorizonal
|> lineTo([3.36, -17.64], %) // HorizontalLineAbsolute
|> lineTo([3.36, -16.8], %) // VerticalLineAbsolute
|> lineTo([2.52, -16.8], %) // HorizontalLineAbsolute
|> lineTo([2.52, -10.080000000000002], %) // VerticalLineHorizonal
|> lineTo([13.44, -10.92], %) // MoveRelative
|> lineTo([13.44, -10.08], %) // VerticalLineHorizonal
|> lineTo([15.12, -10.08], %) // HorizontalLineRelative
|> lineTo([15.12, -13.44], %) // VerticalLineHorizonal
|> lineTo([14.28, -13.44], %) // HorizontalLineRelative
|> lineTo([9.24, -13.44], %) // MoveRelative
|> lineTo([11.76, -13.44], %) // HorizontalLineRelative
|> lineTo([11.76, -14.28], %) // VerticalLineHorizonal
|> lineTo([10.92, -14.28], %) // HorizontalLineRelative here
|> lineTo([10.92, -15.959999999999999], %) // VerticalLineHorizonal
|> lineTo([13.44, -15.959999999999999], %) // HorizontalLineRelative
|> lineTo([13.44, -15.12], %) // VerticalLineHorizonal
|> lineTo([14.28, -15.12], %) // HorizontalLineRelative
|> lineTo([14.28, -15.959999999999999], %) // VerticalLineHorizonal
|> lineTo([13.44, -15.959999999999999], %) // HorizontalLineAbsolute
|> lineTo([13.44, -16.8], %) // VerticalLineAbsolute
|> lineTo([7.56, -16.8], %) // HorizontalLineAbsolute
|> lineTo([7.56, -15.96], %) // VerticalLineAbsolute
|> lineTo([6.72, -15.96], %) // HorizontalLineAbsolute
|> lineTo([6.72, -15.120000000000001], %) // VerticalLineHorizonal
|> lineTo([7.56, -15.120000000000001], %) // HorizontalLineRelative
|> lineTo([7.56, -15.96], %) // VerticalLineHorizonal
|> lineTo([10.08, -15.96], %) // HorizontalLineRelative
|> lineTo([10.08, -14.28], %) // VerticalLineAbsolute
|> lineTo([9.24, -14.28], %) // HorizontalLineAbsolute
|> lineTo([7.56, -12.6], %) // MoveAbsolute
|> lineTo([7.56, -11.76], %) // VerticalLineAbsolute
|> lineTo([5.04, -11.76], %) // HorizontalLineAbsolute
|> lineTo([5.04, -12.6], %) // VerticalLineAbsolute
|> lineTo([4.2, -12.6], %) // HorizontalLineAbsolute
|> lineTo([4.2, -11.76], %) // VerticalLineHorizonal
|> lineTo([5.04, -11.76], %) // HorizontalLineRelative
|> lineTo([5.04, -10.92], %) // VerticalLineHorizonal
|> lineTo([7.5600000000000005, -10.92], %) // HorizontalLineRelative
|> lineTo([7.5600000000000005, -11.76], %) // VerticalLineHorizonal
|> lineTo([8.4, -11.76], %) // HorizontalLineAbsolute
|> lineTo([8.4, -12.6], %) // VerticalLineHorizonal
|> lineTo([3.36, -5.88], %) // MoveAbsolute
|> lineTo([3.36, -5.04], %) // VerticalLineAbsolute
|> lineTo([4.2, -5.04], %) // HorizontalLineAbsolute
|> lineTo([4.2, -3.36], %) // VerticalLineAbsolute
|> lineTo([5.04, -3.36], %) // HorizontalLineAbsolute
|> lineTo([5.04, -1.68], %) // VerticalLineAbsolute
|> lineTo([5.88, -1.68], %) // HorizontalLineAbsolute
|> lineTo([5.88, -0.83999599], %) // VerticalLineAbsolute
|> lineTo([6.72, -0.83999599], %) // HorizontalLineAbsolute
|> lineTo([6.72, -1.68], %) // VerticalLineAbsolute
|> lineTo([7.56, -1.68], %) // HorizontalLineAbsolute
|> lineTo([7.56, -3.36], %) // VerticalLineAbsolute
|> lineTo([8.4, -3.36], %) // HorizontalLineAbsolute
|> lineTo([8.4, -5.04], %) // VerticalLineHorizonal
|> lineTo([9.24, -5.04], %) // HorizontalLineRelative
|> lineTo([9.24, -5.88], %) // VerticalLineHorizonal
|> lineTo([17.64, -5.04], %) // MoveAbsolute
|> lineTo([17.64, -5.88], %) // VerticalLineAbsolute
|> lineTo([11.76, -5.88], %) // HorizontalLineAbsolute
|> lineTo([11.76, -5.04], %) // VerticalLineAbsolute
|> lineTo([12.6, -5.04], %) // HorizontalLineAbsolute
|> lineTo([12.6, -3.36], %) // VerticalLineAbsolute
|> lineTo([13.44, -3.36], %) // HorizontalLineRelative
|> lineTo([13.44, -1.68], %) // VerticalLineAbsolute
|> lineTo([14.28, -1.68], %) // HorizontalLineRelative
|> lineTo([14.28, -0.83999599], %) // VerticalLineAbsolute
|> lineTo([15.12, -0.83999599], %) // HorizontalLineRelative
|> lineTo([15.12, -1.68], %) // VerticalLineAbsolute
|> lineTo([15.959999999999999, -1.68], %) // HorizontalLineRelative
|> lineTo([15.959999999999999, -3.36], %) // VerticalLineHorizonal
|> lineTo([16.8, -3.36], %) // HorizontalLineRelative
|> lineTo([16.8, -5.04], %) // VerticalLineHorizonal
|> lineTo([13.44, -1.68], %) // MoveAbsolute
|> lineTo([13.44, -0], %) // VerticalLineAbsolute
|> lineTo([15.959999999999999, -0], %) // HorizontalLineRelative
|> lineTo([15.959999999999999, -1.68], %) // VerticalLineHorizonal
|> lineTo([16.8, -1.68], %) // HorizontalLineRelative
|> lineTo([16.8, -3.36], %) // VerticalLineHorizonal
|> lineTo([17.64, -3.36], %) // HorizontalLineRelative
|> lineTo([17.64, -4.62], %) // VerticalLineAbsolute
|> lineTo([16.8, -4.62], %) // HorizontalLineAbsolute
|> lineTo([16.8, -3.36], %) // VerticalLineAbsolute
|> lineTo([15.96, -3.36], %) // HorizontalLineAbsolute
|> lineTo([15.96, -1.68], %) // VerticalLineAbsolute
|> lineTo([15.12, -1.68], %) // HorizontalLineAbsolute
|> lineTo([15.12, -0.83999999], %) // VerticalLineAbsolute
|> lineTo([14.28, -0.83999999], %) // HorizontalLineAbsolute
|> lineTo([14.28, -1.68], %) // VerticalLineAbsolute
|> lineTo([13.44, -1.68], %) // HorizontalLineAbsolute
|> lineTo([13.44, -3.36], %) // VerticalLineAbsolute
|> lineTo([12.6, -3.36], %) // HorizontalLineAbsolute
|> lineTo([12.6, -4.62], %) // VerticalLineAbsolute
|> lineTo([11.76, -4.62], %) // HorizontalLineAbsolute
|> lineTo([11.76, -3.36], %) // VerticalLineAbsolute
|> lineTo([12.6, -3.36], %) // HorizontalLineAbsolute
|> lineTo([12.6, -1.68], %) // VerticalLineAbsolute
|> lineTo([5.04, -1.68], %) // MoveAbsolute
|> lineTo([5.04, -0], %) // VerticalLineAbsolute
|> lineTo([7.56, -0], %) // HorizontalLineAbsolute
|> lineTo([7.56, -1.68], %) // VerticalLineAbsolute
|> lineTo([8.4, -1.68], %) // HorizontalLineAbsolute
|> lineTo([8.4, -3.36], %) // VerticalLineAbsolute
|> lineTo([9.24, -3.36], %) // HorizontalLineAbsolute
|> lineTo([9.24, -4.62], %) // VerticalLineAbsolute
|> lineTo([8.4, -4.62], %) // HorizontalLineAbsolute
|> lineTo([8.4, -3.36], %) // VerticalLineAbsolute
|> lineTo([7.56, -3.36], %) // HorizontalLineAbsolute
|> lineTo([7.56, -1.68], %) // VerticalLineAbsolute
|> lineTo([6.72, -1.68], %) // HorizontalLineAbsolute
|> lineTo([6.72, -0.83999999], %) // VerticalLineAbsolute
|> lineTo([5.88, -0.83999999], %) // HorizontalLineAbsolute
|> lineTo([5.88, -1.68], %) // VerticalLineAbsolute
|> lineTo([5.04, -1.68], %) // HorizontalLineAbsolute
|> lineTo([5.04, -3.36], %) // VerticalLineAbsolute
|> lineTo([4.2, -3.36], %) // HorizontalLineAbsolute
|> lineTo([4.2, -4.62], %) // VerticalLineAbsolute
|> lineTo([3.36, -4.62], %) // HorizontalLineAbsolute
|> lineTo([3.36, -3.36], %) // VerticalLineAbsolute
|> lineTo([4.2, -3.36], %) // HorizontalLineAbsolute
|> lineTo([4.2, -1.68], %) // VerticalLineAbsolute
|> lineTo([13.44, -5.88], %) // MoveAbsolute
|> lineTo([13.44, -5.04], %) // VerticalLineAbsolute
|> lineTo([14.28, -5.04], %) // HorizontalLineRelative
|> lineTo([14.28, -4.2], %) // VerticalLineAbsolute
|> lineTo([15.12, -4.2], %) // HorizontalLineRelative
|> lineTo([15.12, -5.04], %) // VerticalLineHorizonal
|> lineTo([15.959999999999999, -5.04], %) // HorizontalLineRelative
|> lineTo([15.959999999999999, -5.88], %) // VerticalLineHorizonal
|> lineTo([5.88, -5.04], %) // MoveAbsolute
|> lineTo([5.88, -4.2], %) // VerticalLineAbsolute
|> lineTo([6.72, -4.2], %) // HorizontalLineAbsolute
|> lineTo([6.72, -5.04], %) // VerticalLineAbsolute
|> lineTo([7.56, -5.04], %) // HorizontalLineAbsolute
|> lineTo([7.56, -5.88], %) // VerticalLineAbsolute
|> lineTo([5.04, -5.88], %) // HorizontalLineAbsolute
|> lineTo([5.04, -5.04], %) // VerticalLineAbsolute
|> lineTo([17.64, -5.88], %) // MoveAbsolute
|> lineTo([17.64, -5.04], %) // VerticalLineAbsolute
|> lineTo([16.8, -5.04], %) // HorizontalLineAbsolute
|> lineTo([16.8, -4.2], %) // VerticalLineAbsolute
|> lineTo([17.64, -4.2], %) // HorizontalLineRelative
|> lineTo([17.64, -5.04], %) // VerticalLineHorizonal
|> lineTo([18.48, -5.04], %) // HorizontalLineRelative
|> lineTo([18.48, -5.88], %) // VerticalLineHorizonal
|> lineTo([3.36, -5.04], %) // MoveAbsolute
|> lineTo([3.36, -5.88], %) // VerticalLineAbsolute
|> lineTo([2.52, -5.88], %) // HorizontalLineAbsolute
|> lineTo([2.52, -5.04], %) // VerticalLineAbsolute
|> lineTo([3.36, -5.04], %) // HorizontalLineAbsolute
|> lineTo([3.36, -4.2], %) // VerticalLineAbsolute
|> lineTo([4.2, -4.2], %) // HorizontalLineAbsolute
|> lineTo([4.2, -5.04], %) // VerticalLineHorizonal
|> lineTo([8.4, -4.2], %) // MoveRelative
|> lineTo([9.24, -4.2], %) // HorizontalLineRelative
|> lineTo([9.24, -5.04], %) // VerticalLineHorizonal
|> lineTo([10.08, -5.04], %) // HorizontalLineRelative
|> lineTo([10.08, -5.88], %) // VerticalLineAbsolute
|> lineTo([9.24, -5.88], %) // HorizontalLineAbsolute
|> lineTo([9.24, -5.04], %) // VerticalLineAbsolute
|> lineTo([8.4, -5.04], %) // HorizontalLineAbsolute
|> lineTo([11.76, -4.2], %) // MoveAbsolute
|> lineTo([12.6, -4.2], %) // HorizontalLineAbsolute
|> lineTo([12.6, -5.04], %) // VerticalLineAbsolute
|> lineTo([11.76, -5.04], %) // HorizontalLineAbsolute
|> lineTo([11.76, -5.88], %) // VerticalLineAbsolute
|> lineTo([10.92, -5.88], %) // HorizontalLineAbsolute
|> lineTo([10.92, -5.04], %) // VerticalLineAbsolute
|> lineTo([11.76, -5.04], %) // HorizontalLineRelative
|> lineTo([14.28, -10.92], %) // MoveRelative
|> lineTo([13.44, -10.92], %) // HorizontalLineRelative
|> lineTo([13.44, -13.44], %) // VerticalLineHorizonal
|> lineTo([14.28, -13.44], %) // HorizontalLineRelative
|> close(%)
show(svg)

View File

@@ -466,5 +466,5 @@ const svg = startSketchAt([0, 0])
|> bezierCurve({ control1: [-4, -3], control2: [-2.66, -3.67], to: [-3.32, -3.34] }, %) // CubicBezierAbsolute
|> bezierCurve({ control1: [0, -2], control2: [-2.68, -2.67], to: [-1.36, -2.34] }, %) // CubicBezierAbsolute
|> bezierCurve({ control1: [0, -0], control2: [0, -1.34], to: [0, -0.68] }, %) // CubicBezierAbsolute
|> close(%);
show(svg);
|> close(%)
show(svg)

View File

@@ -15,6 +15,7 @@ async fn execute_and_snapshot(code: &str) -> Result<image::DynamicImage> {
// For file conversions we need this to be long.
.timeout(std::time::Duration::from_secs(600))
.connect_timeout(std::time::Duration::from_secs(60))
.connection_verbose(true)
.tcp_keepalive(std::time::Duration::from_secs(600))
.http1_only();
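
The comment above explains the unusually long timeout: file conversions through the engine can take minutes, so the test client must not give up early. A sketch of the whole builder chain, assuming this is a stock reqwest ClientBuilder as the method names suggest (the wrapper function is illustrative):

// HTTP client settings for long-running engine and file-conversion requests.
fn engine_test_client_builder() -> reqwest::ClientBuilder {
    reqwest::Client::builder()
        .timeout(std::time::Duration::from_secs(600)) // overall request timeout, long on purpose
        .connect_timeout(std::time::Duration::from_secs(60))
        .connection_verbose(true) // newly added: log connection-level events while debugging tests
        .tcp_keepalive(std::time::Duration::from_secs(600))
        .http1_only()
}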
@@ -31,16 +32,16 @@ async fn execute_and_snapshot(code: &str) -> Result<image::DynamicImage> {
// Create a temporary file to write the output to.
let output_file = std::env::temp_dir().join(format!("kcl_output_{}.png", uuid::Uuid::new_v4()));
let tokens = kcl_lib::tokeniser::lexer(code);
let tokens = kcl_lib::token::lexer(code);
let parser = kcl_lib::parser::Parser::new(tokens);
let program = parser.ast()?;
let mut mem: kcl_lib::executor::ProgramMemory = Default::default();
let mut engine = kcl_lib::engine::EngineConnection::new(ws).await?;
let _ = kcl_lib::executor::execute(program, &mut mem, kcl_lib::executor::BodyType::Root, &mut engine)?;
let engine = kcl_lib::engine::EngineConnection::new(ws).await?;
let _ = kcl_lib::executor::execute(program, &mut mem, kcl_lib::executor::BodyType::Root, &engine).await?;
// Send a snapshot request to the engine.
let resp = engine
.send_modeling_cmd_get_response(
.send_modeling_cmd(
uuid::Uuid::new_v4(),
kcl_lib::executor::SourceRange::default(),
kittycad::types::ModelingCmd::TakeSnapshot {
@@ -174,6 +175,14 @@ async fn serial_test_execute_pipes_on_pipes() {
twenty_twenty::assert_image("tests/executor/outputs/pipes_on_pipes.png", &result, 1.0);
}
#[tokio::test(flavor = "multi_thread")]
async fn serial_test_execute_kittycad_svg() {
let code = include_str!("inputs/kittycad_svg.kcl");
let result = execute_and_snapshot(code).await.unwrap();
twenty_twenty::assert_image("tests/executor/outputs/kittycad_svg.png", &result, 1.0);
}
#[tokio::test(flavor = "multi_thread")]
async fn test_member_expression_sketch_group() {
let code = r#"fn cube = (pos, scale) => {
@@ -201,3 +210,45 @@ show(b2)"#;
1.0,
);
}
#[tokio::test(flavor = "multi_thread")]
async fn test_close_arc() {
let code = r#"const center = [0,0]
const radius = 40
const height = 3
const body = startSketchAt([center[0]+radius, center[1]])
|> arc({angle_end: 360, angle_start: 0, radius: radius}, %)
|> close(%)
|> extrude(height, %)
show(body)"#;
let result = execute_and_snapshot(code).await.unwrap();
twenty_twenty::assert_image("tests/executor/outputs/close_arc.png", &result, 1.0);
}
#[tokio::test(flavor = "multi_thread")]
async fn test_negative_args() {
let code = r#"const width = 5
const height = 10
const length = 12
fn box = (sk1, sk2, scale) => {
const boxSketch = startSketchAt([sk1, sk2])
|> line([0, scale], %)
|> line([scale, 0], %)
|> line([0, -scale], %)
|> close(%)
|> extrude(scale, %)
return boxSketch
}
box(0, 0, 5)
box(10, 23, 8)
let thing = box(-12, -15, 10)
box(-20, -5, 10)"#;
let result = execute_and_snapshot(code).await.unwrap();
twenty_twenty::assert_image("tests/executor/outputs/negative_args.png", &result, 1.0);
}
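
The executor changes above replace &mut engine with a shared &engine and await both execution and engine commands. A condensed sketch of the updated token-to-execution flow, assuming the module paths and signatures shown in this diff (the helper name and its anyhow-based return type are illustrative):

// Tokenize, parse, and execute a KCL program against an already-connected engine.
async fn run_kcl(code: &str, engine: &kcl_lib::engine::EngineConnection) -> anyhow::Result<()> {
    let tokens = kcl_lib::token::lexer(code);
    let parser = kcl_lib::parser::Parser::new(tokens);
    let program = parser.ast()?;
    let mut mem: kcl_lib::executor::ProgramMemory = Default::default();
    // The engine now goes in by shared reference, and execution is awaited.
    let _memory = kcl_lib::executor::execute(program, &mut mem, kcl_lib::executor::BodyType::Root, engine).await?;
    Ok(())
}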

Binary file not shown. After: 95 KiB.

Binary file not shown. After: 90 KiB.

Binary file not shown. Before: 69 KiB. After: 69 KiB.

Binary file not shown. After: 78 KiB.

View File

@@ -33,17 +33,13 @@ async fn setup(code: &str, name: &str) -> Result<(EngineConnection, Program, uui
.commands_ws(None, None, None, None, Some(false))
.await?;
let tokens = kcl_lib::tokeniser::lexer(code);
let tokens = kcl_lib::token::lexer(code);
let parser = kcl_lib::parser::Parser::new(tokens);
let program = parser.ast()?;
let mut mem: kcl_lib::executor::ProgramMemory = Default::default();
let mut engine = kcl_lib::engine::EngineConnection::new(ws).await?;
let memory = kcl_lib::executor::execute(
program.clone(),
&mut mem,
kcl_lib::executor::BodyType::Root,
&mut engine,
)?;
let engine = kcl_lib::engine::EngineConnection::new(ws).await?;
let memory =
kcl_lib::executor::execute(program.clone(), &mut mem, kcl_lib::executor::BodyType::Root, &engine).await?;
// We need to get the sketch ID.
// Get the sketch group ID from memory.
@@ -53,38 +49,44 @@ async fn setup(code: &str, name: &str) -> Result<(EngineConnection, Program, uui
let sketch_id = sketch_group.id;
let plane_id = uuid::Uuid::new_v4();
engine.send_modeling_cmd(
plane_id,
SourceRange::default(),
ModelingCmd::MakePlane {
clobber: false,
origin: Point3D { x: 0.0, y: 0.0, z: 0.0 },
size: 60.0,
x_axis: Point3D { x: 1.0, y: 0.0, z: 0.0 },
y_axis: Point3D { x: 0.0, y: 1.0, z: 0.0 },
},
)?;
engine
.send_modeling_cmd(
plane_id,
SourceRange::default(),
ModelingCmd::MakePlane {
clobber: false,
origin: Point3D { x: 0.0, y: 0.0, z: 0.0 },
size: 60.0,
x_axis: Point3D { x: 1.0, y: 0.0, z: 0.0 },
y_axis: Point3D { x: 0.0, y: 1.0, z: 0.0 },
},
)
.await?;
// Enter sketch mode.
// We can't get control points without being in sketch mode.
// You can however get path info without sketch mode.
engine.send_modeling_cmd(
uuid::Uuid::new_v4(),
SourceRange::default(),
ModelingCmd::SketchModeEnable {
animated: false,
ortho: true,
plane_id,
},
)?;
engine
.send_modeling_cmd(
uuid::Uuid::new_v4(),
SourceRange::default(),
ModelingCmd::SketchModeEnable {
animated: false,
ortho: true,
plane_id,
},
)
.await?;
// Enter edit mode.
// We can't get control points of an existing sketch without being in edit mode.
engine.send_modeling_cmd(
uuid::Uuid::new_v4(),
SourceRange::default(),
ModelingCmd::EditModeEnter { target: sketch_id },
)?;
engine
.send_modeling_cmd(
uuid::Uuid::new_v4(),
SourceRange::default(),
ModelingCmd::EditModeEnter { target: sketch_id },
)
.await?;
Ok((engine, program, sketch_id))
}

View File

@@ -1530,10 +1530,10 @@
resolved "https://registry.yarnpkg.com/@juggle/resize-observer/-/resize-observer-3.4.0.tgz#08d6c5e20cf7e4cc02fd181c4b0c225cd31dbb60"
integrity sha512-dfLbk+PwWvFzSxwk3n5ySL0hfBog779o8h68wK/7/APo/7cgyWp5jcXockbxdk5kFRkbeXWm4Fbi9FrdN381sA==
"@kittycad/lib@^0.0.37":
version "0.0.37"
resolved "https://registry.yarnpkg.com/@kittycad/lib/-/lib-0.0.37.tgz#ec4f6c4fb5d06402a19339f3374036b6582d2265"
integrity sha512-P8p9FeLV79/0Lfd0RioBta1drzhmpROnU4YV38+zsAA4LhibQCTjeekRkxVvHztGumPxz9pPsAeeLJyuz2RWKQ==
"@kittycad/lib@^0.0.39":
version "0.0.39"
resolved "https://registry.yarnpkg.com/@kittycad/lib/-/lib-0.0.39.tgz#e548acf5ff7d45a1f1ec9ad2c61ddcfc30d159b7"
integrity sha512-cB4wNjsKTMpJUn/kMK3qtkVAqB1csSglqThe+bj02nC1kWTB1XgYxksooc/Gzl1MoK1/n0OPQcbOb7Tojb836A==
dependencies:
node-fetch "3.3.2"
openapi-types "^12.0.0"
@@ -1883,10 +1883,10 @@
resolved "https://registry.yarnpkg.com/@types/crypto-js/-/crypto-js-4.1.1.tgz#602859584cecc91894eb23a4892f38cfa927890d"
integrity sha512-BG7fQKZ689HIoc5h+6D2Dgq1fABRa0RbBWKBd9SP/MVRVXROflpm5fhwyATX5duFmbStzyzyycPB8qUYKDH3NA==
"@types/debounce@^1.2.1":
version "1.2.1"
resolved "https://registry.yarnpkg.com/@types/debounce/-/debounce-1.2.1.tgz#79b65710bc8b6d44094d286aecf38e44f9627852"
integrity sha512-epMsEE85fi4lfmJUH/89/iV/LI+F5CvNIvmgs5g5jYFPfhO2S/ae8WSsLOKWdwtoaZw9Q2IhJ4tQ5tFCcS/4HA==
"@types/debounce-promise@^3.1.6":
version "3.1.6"
resolved "https://registry.yarnpkg.com/@types/debounce-promise/-/debounce-promise-3.1.6.tgz#873e838574011095ed0debf73eed3538e1261d75"
integrity sha512-DowqK95aku+OxMCeG2EQSeXeGeE8OCwLpMsUfIbP7hMF8Otj8eQXnzpwdtIKV+UqQBtkMcF6vbi4Otbh8P/wmg==
"@types/eslint@^8.4.5":
version "8.44.1"
@@ -2806,6 +2806,11 @@ data-uri-to-buffer@^4.0.0:
resolved "https://registry.yarnpkg.com/data-uri-to-buffer/-/data-uri-to-buffer-4.0.1.tgz#d8feb2b2881e6a4f58c2e08acfd0e2834e26222e"
integrity sha512-0R9ikRb668HB7QDxT1vkpuUBtqc53YyAwMwGeUFKRojY/NWKvdZ+9UYtRfGmhqNbRkTSVpMbmyhXipFFv2cb/A==
debounce-promise@^3.1.2:
version "3.1.2"
resolved "https://registry.yarnpkg.com/debounce-promise/-/debounce-promise-3.1.2.tgz#320fb8c7d15a344455cd33cee5ab63530b6dc7c5"
integrity sha512-rZHcgBkbYavBeD9ej6sP56XfG53d51CD4dnaw989YX/nZ/ZJfgRx/9ePKmTNiUiyQvh4mtrMoS3OAWW+yoYtpg==
debug@^3.2.7:
version "3.2.7"
resolved "https://registry.yarnpkg.com/debug/-/debug-3.2.7.tgz#72580b7e9145fb39b6676f9c5e5fb100b934179a"