Compare commits

...

27 Commits

Author SHA1 Message Date
6ddbb7a31d Fix unknown property error message on sketches and solids (#7632)
* Fix unknown property error message on sketches and solids

* Add suggestion for common case

* Move test code in file to avoid conflict
2025-07-01 18:37:01 +00:00
051bb0589e KCL: rectangle function (#7616)
* KCL test for rectangle

* Rectangle function

* Rectangle helper tests

* Rectangle helper

* Fix clippy lints

* Update docs

* fmt

* Fix bug

* fmt

* Fix doc comments

* Update generated docs

---------

Co-authored-by: Jonathan Tran <jonnytran@gmail.com>
2025-07-01 14:26:04 -04:00
7f9851ae28 [Chore]: Added url-checker, updated circular-deps, documented new static analysis .txt pattern (#7442)
* fix: ignoring url checker files

* fix: url checker

* fix: auto fmt and cleanup

* fix: moving the bash scripts and known files into the scripts repo

* fix: removed all url_results and made it be all in memory

* fix: fixed the newline issue

* fix: url checking as a step to the static analysis

* fix: removed old code

* chore: writing documentation on our static checker pattern

* fix: updating the docs more to be clearer

* fix: copy and paste without understanding requirements of ci cd dependencies? do i need all of these?

* fix: updating

* fix: I thought this got in?

* Update CONTRIBUTING.md

Co-authored-by: Jace Browning <jacebrowning@gmail.com>

---------

Co-authored-by: Jace Browning <jacebrowning@gmail.com>
2025-07-01 13:01:42 -05:00
fbcbb341e2 KCL: Add planeOf function to stdlib (#7643)
Gets the plane a face lies on, if any.

Closes #7642
2025-07-01 17:42:12 +00:00
4a080d1583 Bump @types/node from 22.15.32 to 24.0.3 in /packages/codemirror-lsp-client in the major group (#7490)
Bump @types/node in /packages/codemirror-lsp-client in the major group

Bumps the major group in /packages/codemirror-lsp-client with 1 update: [@types/node](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/node).


Updates `@types/node` from 22.15.32 to 24.0.3
- [Release notes](https://github.com/DefinitelyTyped/DefinitelyTyped/releases)
- [Commits](https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/node)

---
updated-dependencies:
- dependency-name: "@types/node"
  dependency-version: 24.0.3
  dependency-type: direct:development
  update-type: version-update:semver-major
  dependency-group: major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Pierre Jacquier <pierrejacquier39@gmail.com>
2025-07-01 12:01:45 -04:00
85c721fb49 Add display of units for calculated KCL values (#7619)
* Add display of units in UI modals with calculated KCL values

* Fix command bar display to handle units

* Add display of units in the command bar

* Fix more cases of NaN from units

* Fix to support explicit plus for exponent in scientific notation

* Fix display in autocomplete

* Change to parseFloat to be more resilient

* Add e2e test for command bar

* Change an existing test to use explicit inline units

* Fix case when input string can't be parsed
2025-06-30 15:26:45 -04:00
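A note on the `parseFloat` bullet above: JavaScript's `parseFloat` reads the leading numeric portion of a string and ignores a trailing unit suffix, which is why it is more resilient for unit-suffixed KCL values. This is an illustrative snippet of that standard behavior, not the app's actual code:

```typescript
// Illustrative only: how parseFloat behaves on unit-suffixed value strings.
const samples = ['45deg', '1.5e+2mm', '3in', 'abc']
const parsed = samples.map((s) => parseFloat(s))
// '45deg'    -> 45   (trailing unit ignored)
// '1.5e+2mm' -> 150  (explicit plus in the exponent is accepted)
// '3in'      -> 3
// 'abc'      -> NaN  (no leading number at all)
console.log(parsed)
```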
27af2d08a3 Bump the patch group in /rust with 3 updates (#7575)
* Bump the patch group in /rust with 3 updates

Bumps the patch group in /rust with 3 updates: [toml_edit](https://github.com/toml-rs/toml), [syn](https://github.com/dtolnay/syn) and [toml](https://github.com/toml-rs/toml).


Updates `toml_edit` from 0.22.26 to 0.22.27
- [Commits](https://github.com/toml-rs/toml/compare/v0.22.26...v0.22.27)

Updates `syn` from 2.0.103 to 2.0.104
- [Release notes](https://github.com/dtolnay/syn/releases)
- [Commits](https://github.com/dtolnay/syn/compare/2.0.103...2.0.104)

Updates `toml` from 0.8.22 to 0.8.23
- [Commits](https://github.com/toml-rs/toml/compare/toml-v0.8.22...toml-v0.8.23)

---
updated-dependencies:
- dependency-name: toml_edit
  dependency-version: 0.22.27
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: patch
- dependency-name: syn
  dependency-version: 2.0.104
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: patch
- dependency-name: toml
  dependency-version: 0.8.23
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: patch
...

Signed-off-by: dependabot[bot] <support@github.com>

* Trigger CI

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Jonathan Tran <jonnytran@gmail.com>
2025-06-30 12:09:43 -04:00
fb8b975b5e Bump esbuild from 0.25.2 to 0.25.3 in the security group across 1 directory (#6681)
Bump esbuild in the security group across 1 directory

Bumps the security group with 1 update in the / directory: [esbuild](https://github.com/evanw/esbuild).


Updates `esbuild` from 0.25.2 to 0.25.3
- [Release notes](https://github.com/evanw/esbuild/releases)
- [Changelog](https://github.com/evanw/esbuild/blob/main/CHANGELOG.md)
- [Commits](https://github.com/evanw/esbuild/compare/v0.25.2...v0.25.3)

---
updated-dependencies:
- dependency-name: esbuild
  dependency-version: 0.25.3
  dependency-type: direct:development
  dependency-group: security
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-06-30 15:40:33 +00:00
62d8d45a58 Bump the major group across 1 directory with 4 updates (#7572)
* Bump the major group across 1 directory with 4 updates

Bumps the major group with 4 updates in the / directory: [dawidd6/action-download-artifact](https://github.com/dawidd6/action-download-artifact), [runs-on/action](https://github.com/runs-on/action), [actions/create-github-app-token](https://github.com/actions/create-github-app-token) and [astral-sh/setup-uv](https://github.com/astral-sh/setup-uv).


Updates `dawidd6/action-download-artifact` from 7 to 11
- [Release notes](https://github.com/dawidd6/action-download-artifact/releases)
- [Commits](https://github.com/dawidd6/action-download-artifact/compare/v7...v11)

Updates `runs-on/action` from 1 to 2
- [Release notes](https://github.com/runs-on/action/releases)
- [Commits](https://github.com/runs-on/action/compare/v1...v2)

Updates `actions/create-github-app-token` from 1 to 2
- [Release notes](https://github.com/actions/create-github-app-token/releases)
- [Commits](https://github.com/actions/create-github-app-token/compare/v1...v2)

Updates `astral-sh/setup-uv` from 5 to 6
- [Release notes](https://github.com/astral-sh/setup-uv/releases)
- [Commits](https://github.com/astral-sh/setup-uv/compare/v5...v6)

---
updated-dependencies:
- dependency-name: dawidd6/action-download-artifact
  dependency-version: '11'
  dependency-type: direct:production
  update-type: version-update:semver-major
  dependency-group: major
- dependency-name: runs-on/action
  dependency-version: '2'
  dependency-type: direct:production
  update-type: version-update:semver-major
  dependency-group: major
- dependency-name: actions/create-github-app-token
  dependency-version: '2'
  dependency-type: direct:production
  update-type: version-update:semver-major
  dependency-group: major
- dependency-name: astral-sh/setup-uv
  dependency-version: '6'
  dependency-type: direct:production
  update-type: version-update:semver-major
  dependency-group: major
...

Signed-off-by: dependabot[bot] <support@github.com>

* bump

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Max Ammann <max.ammann@zoo.dev>
2025-06-30 10:07:54 -04:00
ae3440df0a Use proper envs for Rust functions (#7623) 2025-06-29 07:03:36 -05:00
af658c909d Enterprise plans should not have the upgrade button (#7628)
* Enterprise plans should not have the upgrade button
Fixes #7627

* Move the check to BillingDialog

* Hide home box and change bool check

* Add component tests

* Clean up
2025-06-28 12:03:41 -04:00
7ec11d23c8 Capitalize labels in the native file menu (#7639) 2025-06-28 12:00:47 -04:00
30000a1eac Fix the vertical alignment on the temporary workspace label (#7638) 2025-06-28 13:23:47 +00:00
cb3b45747c Change AI to ML because cringe (#7636)
Change AI to ML
2025-06-27 19:44:05 +00:00
fe66310f2d Update output to match main (#7630) 2025-06-27 17:29:27 +00:00
fefb6cfe87 Rerun sim tests after #7608 (#7624) 2025-06-27 10:21:30 -04:00
0f8375cbb4 [BUG] offsetPlane back-side startSketchOn (#7622)
offset backPlane selection bug
2025-06-27 21:36:42 +10:00
107adc77b3 Allow the origin of rotation to be specified (#7608)
* pass axis origin to endpoint

* fmt

* fix lint errors

* update sim tests with new transform endpoint

* added missed files

* revert cargo.toml

* implement review requests

* fmt

* revert unnecessary custom origin
2025-06-27 00:38:18 +01:00
4356885aa2 Bump cargo to 1.88; 2024 edition for kcl-lib (#7618)
This is a big one because the edition changes a fair number of things.
2025-06-26 22:02:54 +00:00
6a2027cd51 Keep subtract solids selection to one until engine supports multiple (#7617)
Follow up to #7614
2025-06-26 21:36:05 +00:00
f49cf8281c Allow point-and-click Subtract to take in multiple solids and tools (#7614)
* Allow point-and-click Subtract to take in multiple tools
Fixes #7612

* Change target to solids for consistency and make it support multi select too

* Improve err message

* Update src/lang/modifyAst/boolean.ts

Co-authored-by: graphite-app[bot] <96075541+graphite-app[bot]@users.noreply.github.com>

* Update src/lang/modifyAst/boolean.ts

Co-authored-by: Kurt Hutten <k.hutten@protonmail.ch>

* Good bot

* Reduce array to single value if len 1

* Remove console.log

---------

Co-authored-by: graphite-app[bot] <96075541+graphite-app[bot]@users.noreply.github.com>
Co-authored-by: Kurt Hutten <k.hutten@protonmail.ch>
2025-06-26 16:43:53 -04:00
7de27c648f Revoke token when logging out (#7493)
* Revoke token when logging out

* extract OAUTH2_DEVICE_CLIENT_ID

* Update snapshots

* Update snapshots

* try fix

* try fix

* Move client id to `@src/lib/constants`

---------

Co-authored-by: Jonathan Tran <jonnytran@gmail.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
Co-authored-by: Pierre Jacquier <pierrejacquier39@gmail.com>
Co-authored-by: Pierre Jacquier <pierre@zoo.dev>
2025-06-26 15:24:16 -04:00
344fb6f84d Hide Helix arguments that should have been hidden, plus other flow fixes (#7606)
* Make sure mode-related args are hidden in point-and-click commands after option args change
Fixes #7589

* WIP improving helix flows and fixing tests

* Fix 2 more tests

* Add test step for opt arg

* Fix last helix test

* Clean up tests, hope to fix CI
2025-06-26 14:12:36 -04:00
df808b3e58 Bump google-github-actions/auth from 2.1.8 to 2.1.10 in the patch group across 1 directory (#6566)
Bump google-github-actions/auth in the patch group across 1 directory

Bumps the patch group with 1 update in the / directory: [google-github-actions/auth](https://github.com/google-github-actions/auth).


Updates `google-github-actions/auth` from 2.1.8 to 2.1.10
- [Release notes](https://github.com/google-github-actions/auth/releases)
- [Changelog](https://github.com/google-github-actions/auth/blob/main/CHANGELOG.md)
- [Commits](https://github.com/google-github-actions/auth/compare/v2.1.8...v2.1.10)

---
updated-dependencies:
- dependency-name: google-github-actions/auth
  dependency-version: 2.1.10
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Pierre Jacquier <pierrejacquier39@gmail.com>
2025-06-25 13:33:09 -04:00
e1ab6bbc48 Swap "must" for "should" in identifier casing lint (#7604)
Draft: Swap "must" for "should" in identifier casing lint
2025-06-25 12:48:02 -04:00
0a1f35b89a Release KCL 83 (#7603) 2025-06-25 10:42:57 -04:00
78278d6889 Force the samples manifest to be updated (#7591)
* Force the samples manifest to be updated

* Skip manifest generation on Windows

This results in non-POSIX paths in the manifest.
2025-06-25 10:42:39 -04:00
258 changed files with 9099 additions and 2343 deletions

View File

```diff
@@ -43,7 +43,7 @@ jobs:
       - name: Download Wasm Cache
         id: download-wasm
         if: ${{ github.event_name == 'pull_request' && steps.filter.outputs.rust == 'false' }}
-        uses: dawidd6/action-download-artifact@v7
+        uses: dawidd6/action-download-artifact@v11
         continue-on-error: true
         with:
           github_token: ${{secrets.GITHUB_TOKEN}}
@@ -362,7 +362,7 @@ jobs:
       - name: Authenticate to Google Cloud
         if: ${{ env.IS_STAGING == 'true' }}
-        uses: 'google-github-actions/auth@v2.1.8'
+        uses: 'google-github-actions/auth@v2.1.10'
         with:
           credentials_json: '${{ secrets.GOOGLE_CLOUD_DL_SA }}'
```

View File

```diff
@@ -25,8 +25,8 @@ jobs:
       - runner=8cpu-linux-x64
       - extras=s3-cache
     steps:
-      - uses: runs-on/action@v1
-      - uses: actions/create-github-app-token@v1
+      - uses: runs-on/action@v2
+      - uses: actions/create-github-app-token@v2
         id: app-token
         with:
           app-id: ${{ secrets.MODELING_APP_GH_APP_ID }}
@@ -149,8 +149,8 @@ jobs:
         partitionIndex: [1, 2, 3, 4, 5, 6]
         partitionTotal: [6]
     steps:
-      - uses: runs-on/action@v1
-      - uses: actions/create-github-app-token@v1
+      - uses: runs-on/action@v2
+      - uses: actions/create-github-app-token@v2
         id: app-token
         with:
           app-id: ${{ secrets.MODELING_APP_GH_APP_ID }}
@@ -207,8 +207,8 @@ jobs:
       - runner=32cpu-linux-x64
       - extras=s3-cache
     steps:
-      - uses: runs-on/action@v1
-      - uses: actions/create-github-app-token@v1
+      - uses: runs-on/action@v2
+      - uses: actions/create-github-app-token@v2
         id: app-token
         with:
           app-id: ${{ secrets.MODELING_APP_GH_APP_ID }}
```

View File

```diff
@@ -46,7 +46,7 @@ jobs:
       - name: Download Wasm cache
         id: download-wasm
         if: ${{ github.event_name != 'schedule' && steps.filter.outputs.rust == 'false' }}
-        uses: dawidd6/action-download-artifact@v7
+        uses: dawidd6/action-download-artifact@v11
         continue-on-error: true
         with:
           github_token: ${{secrets.GITHUB_TOKEN}}
@@ -110,7 +110,7 @@ jobs:
     steps:
-      - uses: actions/create-github-app-token@v1
+      - uses: actions/create-github-app-token@v2
         id: app-token
         with:
           app-id: ${{ secrets.MODELING_APP_GH_APP_ID }}
@@ -230,7 +230,7 @@ jobs:
     steps:
-      - uses: actions/create-github-app-token@v1
+      - uses: actions/create-github-app-token@v2
         id: app-token
         with:
           app-id: ${{ secrets.MODELING_APP_GH_APP_ID }}
```

View File

```diff
@@ -20,7 +20,7 @@ jobs:
     runs-on: ubuntu-latest
     steps:
       - uses: actions/checkout@v4
-      - uses: actions/create-github-app-token@v1
+      - uses: actions/create-github-app-token@v2
         id: app-token
         with:
           # required
```

View File

```diff
@@ -328,7 +328,7 @@ jobs:
           mkdir -p releases/language-server/${{ env.TAG }}
           cp -r build/* releases/language-server/${{ env.TAG }}
       - name: "Authenticate to Google Cloud"
-        uses: "google-github-actions/auth@v2.1.8"
+        uses: "google-github-actions/auth@v2.1.10"
         with:
           credentials_json: "${{ secrets.GOOGLE_CLOUD_DL_SA }}"
       - name: Set up Cloud SDK
```

View File

```diff
@@ -113,7 +113,7 @@ jobs:
     steps:
       - uses: actions/checkout@v4
       - name: Install uv
-        uses: astral-sh/setup-uv@v5
+        uses: astral-sh/setup-uv@v6
       - uses: actions-rust-lang/setup-rust-toolchain@v1
       - uses: taiki-e/install-action@just
       - name: Run tests
@@ -130,7 +130,7 @@ jobs:
     steps:
       - uses: actions/checkout@v4
       - name: Install the latest version of uv
-        uses: astral-sh/setup-uv@v5
+        uses: astral-sh/setup-uv@v6
       - name: Install codespell
         run: |
           uv venv .venv
@@ -161,7 +161,7 @@ jobs:
         with:
           path: rust/kcl-python-bindings
       - name: Install the latest version of uv
-        uses: astral-sh/setup-uv@v5
+        uses: astral-sh/setup-uv@v6
       - name: do uv things
         run: |
           cd rust/kcl-python-bindings
```

View File

```diff
@@ -108,7 +108,7 @@ jobs:
         run: npm run files:set-notes
       - name: Authenticate to Google Cloud
-        uses: 'google-github-actions/auth@v2.1.8'
+        uses: 'google-github-actions/auth@v2.1.10'
         with:
           credentials_json: '${{ secrets.GOOGLE_CLOUD_DL_SA }}'
```

View File

```diff
@@ -120,6 +120,36 @@ jobs:
       - run: npm run circular-deps:diff
+  npm-url-checker:
+    runs-on: ubuntu-latest
+    needs: npm-build-wasm
+    steps:
+      - uses: actions/checkout@v4
+      - uses: actions/setup-node@v4
+        with:
+          node-version-file: '.nvmrc'
+          cache: 'npm'
+      - run: npm install
+      - name: Download all artifacts
+        uses: actions/download-artifact@v4
+      - name: Copy prepared wasm
+        run: |
+          ls -R prepared-wasm
+          cp prepared-wasm/kcl_wasm_lib_bg.wasm public
+          mkdir rust/kcl-wasm-lib/pkg
+          cp prepared-wasm/kcl_wasm_lib* rust/kcl-wasm-lib/pkg
+      - name: Copy prepared ts-rs bindings
+        run: |
+          ls -R prepared-ts-rs-bindings
+          mkdir rust/kcl-lib/bindings
+          cp -r prepared-ts-rs-bindings/* rust/kcl-lib/bindings/
+      - run: npm run url-checker:diff
   python-codespell:
     runs-on: ubuntu-22.04
     steps:
```

.gitignore vendored
View File

```diff
@@ -87,4 +87,4 @@ venv
 .vscode-test
 .biome/
 .million
```

View File

````diff
@@ -235,6 +235,47 @@ To display logging (to the terminal or console) set `ZOO_LOG=1`. This will log s
 To enable memory metrics, build with `--features dhat-heap`.
 
+## Running scripts
+
+There are multiple scripts under `./scripts` that can be used in various settings.
+
+### Pattern for a static file, npm run commands, and CI/CD checks
+
+If you want to implement a static checker, follow this pattern. Two static checkers we have are the circular-dependency check for our TypeScript code and the URL checker, which verifies that no hard-coded URL in the TypeScript application 404s. The set of known files in `./scripts/known/*.txt` is the baseline.
+
+If you improve the baseline, run the overwrite command and commit the new, smaller baseline. Try not to make the baseline bigger; CI/CD will complain.
+
+These baselines hold us to higher standards and help implement automated testing against the repository.
+
+#### Output result to stdout
+
+- `npm run circular-deps`
+- `npm run url-checker`
+- create a `<name>.sh` file that runs the static checker and writes the result to `stdout`
+
+#### Overwrite result to known .txt file on disk
+
+If the application needs to overwrite the known file on disk, use this pattern. The known .txt file is source-controlled as the baseline.
+
+- `npm run circular-deps:overwrite`
+- `npm run url-checker:overwrite`
+
+#### Diff baseline and current
+
+These commands write a file under `/tmp/` on disk and compare it to the known file in the repository. They are also used in the CI/CD pipeline for automated checks.
+
+- create a `diff-<name>.sh` file that diffs your tmp file against the baseline, e.g. `diff-url-checker.sh`:
+
+```bash
+#!/bin/bash
+set -euo pipefail
+npm run url-checker > /tmp/urls.txt
+diff --ignore-blank-lines -w /tmp/urls.txt ./scripts/known/urls.txt
+```
+
+- `npm run circular-deps:diff`
+- `npm run url-checker:diff`
+
 ## Proposing changes
 
 Before you submit a contribution PR to this repo, please ensure that:
````
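The baseline pattern described in that CONTRIBUTING section can be sketched end to end as a generic, self-contained shell script. All file names and the stand-in checker below are illustrative, not the repository's actual scripts:

```shell
#!/bin/bash
# Generic sketch of the baseline/overwrite/diff pattern (paths are illustrative).
set -euo pipefail

workdir="$(mktemp -d)"
baseline="$workdir/known.txt"
current="$workdir/current.txt"

# Stand-in for `npm run <checker>`: emit the checker's findings to stdout.
run_checker() { printf 'https://example.com/docs\n'; }

# The :overwrite step -- record today's findings as the committed baseline.
run_checker > "$baseline"

# The :diff step -- CI fails when current findings drift from the baseline,
# because diff exits non-zero and `set -e` aborts the script.
run_checker > "$current"
diff --ignore-blank-lines -w "$current" "$baseline" && echo "baseline matches"
```

In the real pipeline the baseline lives in source control (`./scripts/known/*.txt`) rather than a temp directory, so a shrinking baseline is committed and a growing one fails the check.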

View File

```diff
@@ -62,7 +62,10 @@ else
 endif
 
 public/kcl-samples/manifest.json: $(KCL_SOURCES)
+ifndef WINDOWS
 	cd rust/kcl-lib && EXPECTORATE=overwrite cargo test generate_manifest
+	@ touch $@
+endif
 
 .vite/build/main.js: $(REACT_SOURCES) $(TYPESCRIPT_SOURCES) $(VITE_SOURCES)
 	npm run tronb:vite:dev
```

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

View File

```diff
@@ -67,10 +67,12 @@ layout: manual
 * [`patternCircular2d`](/docs/kcl-std/functions/std-sketch-patternCircular2d)
 * [`patternLinear2d`](/docs/kcl-std/functions/std-sketch-patternLinear2d)
 * [`patternTransform2d`](/docs/kcl-std/functions/std-sketch-patternTransform2d)
+* [`planeOf`](/docs/kcl-std/functions/std-sketch-planeOf)
 * [`polygon`](/docs/kcl-std/functions/std-sketch-polygon)
 * [`profileStart`](/docs/kcl-std/functions/std-sketch-profileStart)
 * [`profileStartX`](/docs/kcl-std/functions/std-sketch-profileStartX)
 * [`profileStartY`](/docs/kcl-std/functions/std-sketch-profileStartY)
+* [`rectangle`](/docs/kcl-std/functions/std-sketch-rectangle)
 * [`revolve`](/docs/kcl-std/functions/std-sketch-revolve)
 * [`segAng`](/docs/kcl-std/functions/std-sketch-segAng)
 * [`segEnd`](/docs/kcl-std/functions/std-sketch-segEnd)
```

View File

```diff
@@ -32,10 +32,12 @@ This module contains functions for creating and manipulating sketches, and makin
 * [`patternCircular2d`](/docs/kcl-std/functions/std-sketch-patternCircular2d)
 * [`patternLinear2d`](/docs/kcl-std/functions/std-sketch-patternLinear2d)
 * [`patternTransform2d`](/docs/kcl-std/functions/std-sketch-patternTransform2d)
+* [`planeOf`](/docs/kcl-std/functions/std-sketch-planeOf)
 * [`polygon`](/docs/kcl-std/functions/std-sketch-polygon)
 * [`profileStart`](/docs/kcl-std/functions/std-sketch-profileStart)
 * [`profileStartX`](/docs/kcl-std/functions/std-sketch-profileStartX)
 * [`profileStartY`](/docs/kcl-std/functions/std-sketch-profileStartY)
+* [`rectangle`](/docs/kcl-std/functions/std-sketch-rectangle)
 * [`revolve`](/docs/kcl-std/functions/std-sketch-revolve)
 * [`segAng`](/docs/kcl-std/functions/std-sketch-segAng)
 * [`segEnd`](/docs/kcl-std/functions/std-sketch-segEnd)
```

View File

```diff
@@ -12,7 +12,7 @@ test.describe('Point and click for boolean workflows', () => {
     },
     {
       name: 'subtract',
-      code: 'subtract([extrude001], tools = [extrude006])',
+      code: 'subtract(extrude001, tools = extrude006)',
     },
     {
       name: 'intersect',
@@ -81,6 +81,8 @@ test.describe('Point and click for boolean workflows', () => {
       if (operationName !== 'subtract') {
         // should down shift key to select multiple objects
         await page.keyboard.down('Shift')
+      } else {
+        await cmdBar.progressCmdBar()
       }
 
       // Select second object
@@ -103,8 +105,8 @@ test.describe('Point and click for boolean workflows', () => {
       await cmdBar.expectState({
         stage: 'review',
         headerArguments: {
-          Tool: '1 path',
-          Target: '1 path',
+          Solids: '1 path',
+          Tools: '1 path',
         },
         commandName,
       })
```

View File

```diff
@@ -5,6 +5,7 @@ import { uuidv4 } from '@src/lib/utils'
 import type { HomePageFixture } from '@e2e/playwright/fixtures/homePageFixture'
 import type { SceneFixture } from '@e2e/playwright/fixtures/sceneFixture'
 import type { ToolbarFixture } from '@e2e/playwright/fixtures/toolbarFixture'
+import type { CmdBarFixture } from '@e2e/playwright/fixtures/cmdBarFixture'
 import { getUtils } from '@e2e/playwright/test-utils'
 import { expect, test } from '@e2e/playwright/zoo-test'
@@ -14,13 +15,18 @@ test.describe('Can create sketches on all planes and their back sides', () => {
     homePage: HomePageFixture,
     scene: SceneFixture,
     toolbar: ToolbarFixture,
+    cmdBar: CmdBarFixture,
     plane: string,
     clickCoords: { x: number; y: number }
   ) => {
     const u = await getUtils(page)
+    // await page.addInitScript(() => {
+    //   localStorage.setItem('persistCode', '@settings(defaultLengthUnit = in)')
+    // })
     await page.setBodyDimensions({ width: 1200, height: 500 })
     await homePage.goToModelingScene()
+    // await scene.settled(cmdBar)
     const XYPlanRed: [number, number, number] = [98, 50, 51]
     await scene.expectPixelColor(XYPlanRed, { x: 700, y: 300 }, 15)
@@ -119,12 +125,166 @@ test.describe('Can create sketches on all planes and their back sides', () => {
     ]
     for (const config of planeConfigs) {
-      test(config.plane, async ({ page, homePage, scene, toolbar }) => {
+      test(config.plane, async ({ page, homePage, scene, toolbar, cmdBar }) => {
         await sketchOnPlaneAndBackSideTest(
           page,
           homePage,
           scene,
           toolbar,
+          cmdBar,
+          config.plane,
+          config.coords
+        )
+      })
+    }
+})
+
+test.describe('Can create sketches on offset planes and their back sides', () => {
+  const sketchOnPlaneAndBackSideTest = async (
+    page: Page,
+    homePage: HomePageFixture,
+    scene: SceneFixture,
+    toolbar: ToolbarFixture,
+    cmdbar: CmdBarFixture,
+    plane: string,
+    clickCoords: { x: number; y: number }
+  ) => {
+    const u = await getUtils(page)
+    await page.addInitScript(() => {
+      localStorage.setItem(
+        'persistCode',
+        `@settings(defaultLengthUnit = in)
+xyPlane = offsetPlane(XY, offset = 0.05)
+xzPlane = offsetPlane(XZ, offset = 0.05)
+yzPlane = offsetPlane(YZ, offset = 0.05)
+`
+      )
+    })
+    await page.setBodyDimensions({ width: 1200, height: 500 })
+    await homePage.goToModelingScene()
+    // await scene.settled(cmdbar)
+    const XYPlanRed: [number, number, number] = [74, 74, 74]
+    await scene.expectPixelColor(XYPlanRed, { x: 700, y: 300 }, 15)
+    await u.openDebugPanel()
+    const coord =
+      plane === '-XY' || plane === '-YZ' || plane === 'XZ' ? -100 : 100
+    const camCommand: EngineCommand = {
+      type: 'modeling_cmd_req',
+      cmd_id: uuidv4(),
+      cmd: {
+        type: 'default_camera_look_at',
+        center: { x: 0, y: 0, z: 0 },
+        vantage: { x: coord, y: coord, z: coord },
+        up: { x: 0, y: 0, z: 1 },
+      },
+    }
+    const updateCamCommand: EngineCommand = {
+      type: 'modeling_cmd_req',
+      cmd_id: uuidv4(),
+      cmd: {
+        type: 'default_camera_get_settings',
+      },
+    }
+    const prefix = plane.length === 3 ? '-' : ''
+    const planeName = plane
+      .slice(plane.length === 3 ? 1 : 0)
+      .toLocaleLowerCase()
+    const codeLine1 = `sketch001 = startSketchOn(${prefix}${planeName}Plane)`
+    const codeLine2 = `profile001 = startProfile(sketch001, at = [${0.91 + (plane[0] === '-' ? 0.01 : 0)}, -${1.21 + (plane[0] === '-' ? 0.01 : 0)}])`
+    await u.openDebugPanel()
+    await u.clearCommandLogs()
+    await page.getByRole('button', { name: 'Start Sketch' }).click()
+    await u.sendCustomCmd(camCommand)
+    await page.waitForTimeout(100)
+    await u.sendCustomCmd(updateCamCommand)
+    await u.closeDebugPanel()
+    await toolbar.openFeatureTreePane()
+    await toolbar.getDefaultPlaneVisibilityButton('XY').click()
+    await toolbar.getDefaultPlaneVisibilityButton('XZ').click()
+    await toolbar.getDefaultPlaneVisibilityButton('YZ').click()
+    await expect(
+      toolbar
+        .getDefaultPlaneVisibilityButton('YZ')
+        .locator('[aria-label="eye crossed out"]')
+    ).toBeVisible()
+    await page.mouse.click(clickCoords.x, clickCoords.y)
+    await page.waitForTimeout(600) // wait for animation
+    await toolbar.waitUntilSketchingReady()
+    await expect(
+      page.getByRole('button', { name: 'line Line', exact: true })
+    ).toBeVisible()
+    await u.closeDebugPanel()
+    await page.mouse.click(707, 393)
+    await expect(page.locator('.cm-content')).toContainText(codeLine1)
+    await expect(page.locator('.cm-content')).toContainText(codeLine2)
+    await page
+      .getByRole('button', { name: 'line Line', exact: true })
+      .first()
+      .click()
+    await u.openAndClearDebugPanel()
+    await page.getByRole('button', { name: 'Exit Sketch' }).click()
+    await u.expectCmdLog('[data-message-type="execution-done"]')
+    await u.clearCommandLogs()
+    await u.removeCurrentCode()
+  }
+  const planeConfigs = [
+    {
+      plane: 'XY',
+      coords: { x: 600, y: 388 },
+      description: 'red plane',
+    },
+    {
+      plane: 'YZ',
+      coords: { x: 700, y: 250 },
+      description: 'green plane',
+    },
+    {
+      plane: 'XZ',
+      coords: { x: 684, y: 427 },
+      description: 'blue plane',
+    },
+    {
+      plane: '-XY',
+      coords: { x: 600, y: 118 },
+      description: 'back of red plane',
+    },
+    {
+      plane: '-YZ',
+      coords: { x: 700, y: 219 },
+      description: 'back of green plane',
+    },
+    {
+      plane: '-XZ',
+      coords: { x: 700, y: 80 },
+      description: 'back of blue plane',
+    },
+  ]
+  for (const config of planeConfigs) {
+    test(config.plane, async ({ page, homePage, scene, toolbar, cmdBar }) => {
+      await sketchOnPlaneAndBackSideTest(
+        page,
+        homePage,
+        scene,
+        toolbar,
+        cmdBar,
         config.plane,
         config.coords
       )
```

View File

```diff
@@ -525,7 +525,9 @@ test.describe('Command bar tests', () => {
     const projectName = 'test'
     const beforeKclCode = `a = 5
 b = a * a
-c = 3 + a`
+c = 3 + a
+theta = 45deg
+`
     await context.folderSetupFn(async (dir) => {
       const testProject = join(dir, projectName)
       await fsp.mkdir(testProject, { recursive: true })
@@ -615,9 +617,45 @@ c = 3 + a`
       stage: 'commandBarClosed',
     })
   })
+  await test.step(`Edit a parameter with explicit units via command bar`, async () => {
+    await cmdBar.cmdBarOpenBtn.click()
+    await cmdBar.chooseCommand('edit parameter')
+    await cmdBar
+      .selectOption({
+        name: 'theta',
+      })
+      .click()
+    await cmdBar.expectState({
+      stage: 'arguments',
+      commandName: 'Edit parameter',
+      currentArgKey: 'value',
+      currentArgValue: '45deg',
+      headerArguments: {
+        Name: 'theta',
+        Value: '',
+      },
+      highlightedHeaderArg: 'value',
+    })
+    await cmdBar.argumentInput
+      .locator('[contenteditable]')
+      .fill('45deg + 1deg')
+    await cmdBar.progressCmdBar()
+    await cmdBar.expectState({
+      stage: 'review',
+      commandName: 'Edit parameter',
+      headerArguments: {
+        Name: 'theta',
+        Value: '46deg',
+      },
+    })
+    await cmdBar.progressCmdBar()
+    await cmdBar.expectState({
+      stage: 'commandBarClosed',
+    })
+  })
   await editor.expectEditor.toContain(
-    `a = 5b = a * amyParameter001 = ${newValue}c = 3 + a`
+    `a = 5b = a * amyParameter001 = ${newValue}c = 3 + atheta = 45deg + 1deg`
   )
 })
```


@@ -288,7 +288,7 @@ a1 = startSketchOn(offsetPlane(XY, offset = 10))
   // error text on hover
   await page.hover('.cm-lint-marker-info')
   await expect(
-    page.getByText('Identifiers must be lowerCamelCase').first()
+    page.getByText('Identifiers should be lowerCamelCase').first()
   ).toBeVisible()
   await page.locator('#code-pane button:first-child').click()
@@ -314,7 +314,7 @@ sketch_001 = startSketchOn(XY)
   // error text on hover
   await page.hover('.cm-lint-marker-info')
   await expect(
-    page.getByText('Identifiers must be lowerCamelCase').first()
+    page.getByText('Identifiers should be lowerCamelCase').first()
   ).toBeVisible()
 })
@@ -511,7 +511,7 @@ sketch_001 = startSketchOn(XY)
   // error text on hover
   await page.hover('.cm-lint-marker-info')
   await expect(
-    page.getByText('Identifiers must be lowerCamelCase').first()
+    page.getByText('Identifiers should be lowerCamelCase').first()
   ).toBeVisible()
   // focus the editor
@@ -539,7 +539,7 @@ sketch_001 = startSketchOn(XY)
   // error text on hover
   await page.hover('.cm-lint-marker-info')
   await expect(
-    page.getByText('Identifiers must be lowerCamelCase').first()
+    page.getByText('Identifiers should be lowerCamelCase').first()
   ).toBeVisible()
 })
@@ -681,7 +681,7 @@ a1 = startSketchOn(offsetPlane(XY, offset = 10))
   // error text on hover
   await page.hover('.cm-lint-marker-info')
   await expect(
-    page.getByText('Identifiers must be lowerCamelCase').first()
+    page.getByText('Identifiers should be lowerCamelCase').first()
   ).toBeVisible()
   // select the line that's causing the error and delete it


@@ -274,6 +274,13 @@ export class ToolbarFixture {
       .nth(operationIndex)
   }
+  getDefaultPlaneVisibilityButton(plane: 'XY' | 'XZ' | 'YZ' = 'XY') {
+    const index = plane === 'XZ' ? 0 : plane === 'XY' ? 1 : 2
+    return this.featureTreePane
+      .getByTestId('feature-tree-visibility-toggle')
+      .nth(index)
+  }
   /**
    * View source on a specific operation in the Feature Tree pane.
    * @param operationName The name of the operation type


@@ -7,6 +7,7 @@ import type { SceneFixture } from '@e2e/playwright/fixtures/sceneFixture'
 import type { ToolbarFixture } from '@e2e/playwright/fixtures/toolbarFixture'
 import { expect, test } from '@e2e/playwright/zoo-test'
 import { bracket } from '@e2e/playwright/fixtures/bracket'
+import type { CmdBarSerialised } from '@e2e/playwright/fixtures/cmdBarFixture'
 // test file is for testing point-and-click code gen functionality that's not sketch mode related
@@ -135,17 +136,17 @@ test.describe('Point-and-click tests', () => {
     highlightedHeaderArg: 'length',
     commandName: 'Extrude',
   })
-  await page.keyboard.insertText('width - 0.001')
+  await page.keyboard.insertText('width - 0.001in')
   await cmdBar.progressCmdBar()
   await cmdBar.expectState({
     stage: 'review',
     headerArguments: {
-      Length: '4.999',
+      Length: '4.999in',
     },
     commandName: 'Extrude',
   })
   await cmdBar.progressCmdBar()
-  await editor.expectEditor.toContain('extrude(length = width - 0.001)')
+  await editor.expectEditor.toContain('extrude(length = width - 0.001in)')
 })
 await test.step(`Edit second extrude via feature tree`, async () => {
@@ -1141,6 +1142,20 @@ openSketch = startSketchOn(XY)
   })
 })
+const initialCmdBarStateHelix: CmdBarSerialised = {
+  stage: 'arguments',
+  currentArgKey: 'mode',
+  currentArgValue: '',
+  headerArguments: {
+    Mode: '',
+    AngleStart: '',
+    Revolutions: '',
+    Radius: '',
+  },
+  highlightedHeaderArg: 'mode',
+  commandName: 'Helix',
+}
 test('Helix point-and-click on default axis', async ({
   context,
   page,
@@ -1150,30 +1165,14 @@ openSketch = startSketchOn(XY)
   toolbar,
   cmdBar,
 }) => {
-  // One dumb hardcoded screen pixel value
-  const testPoint = { x: 620, y: 257 }
-  const expectedOutput = `helix001 = helix( axis = X, radius = 5, length = 5, revolutions = 1, angleStart = 270, ccw = false,)`
+  const expectedOutput = `helix001 = helix( axis = X, radius = 5, length = 5, revolutions = 1, angleStart = 270,)`
   const expectedLine = `axis=X,`
   await homePage.goToModelingScene()
   await scene.connectionEstablished()
   await test.step(`Go through the command bar flow`, async () => {
     await toolbar.helixButton.click()
-    await cmdBar.expectState({
-      stage: 'arguments',
-      currentArgKey: 'mode',
-      currentArgValue: '',
-      headerArguments: {
-        Mode: '',
-        AngleStart: '',
-        Revolutions: '',
-        Radius: '',
-        CounterClockWise: '',
-      },
-      highlightedHeaderArg: 'mode',
-      commandName: 'Helix',
-    })
+    await cmdBar.expectState(initialCmdBarStateHelix)
     await cmdBar.progressCmdBar()
     await expect.poll(() => page.getByText('Axis').count()).toBe(6)
     await cmdBar.progressCmdBar()
@@ -1190,7 +1189,6 @@ openSketch = startSketchOn(XY)
       AngleStart: '',
       Length: '',
       Radius: '',
-      CounterClockWise: '',
     },
     commandName: 'Helix',
   })
@@ -1207,11 +1205,10 @@ openSketch = startSketchOn(XY)
       Revolutions: '1',
       Length: '5',
       Radius: '5',
-      CounterClockWise: '',
     },
     commandName: 'Helix',
   })
-  await cmdBar.progressCmdBar()
+  await cmdBar.submit()
 })
 await test.step(`Confirm code is added to the editor, scene has changed`, async () => {
@@ -1221,8 +1218,6 @@ openSketch = startSketchOn(XY)
     activeLines: [expectedLine],
     highlightedCode: '',
   })
-  // Red plane is now gone, white helix is there
-  await scene.expectPixelColor([250, 250, 250], testPoint, 15)
 })
 await test.step(`Edit helix through the feature tree`, async () => {
@@ -1234,21 +1229,18 @@ openSketch = startSketchOn(XY)
   await cmdBar.expectState({
     commandName: 'Helix',
     stage: 'arguments',
-    currentArgKey: 'CounterClockWise',
-    currentArgValue: '',
+    currentArgKey: 'length',
+    currentArgValue: '5',
     headerArguments: {
       Axis: 'X',
       AngleStart: '270',
       Revolutions: '1',
       Radius: '5',
       Length: initialInput,
-      CounterClockWise: '',
     },
-    highlightedHeaderArg: 'CounterClockWise',
+    highlightedHeaderArg: 'length',
   })
-  await page.keyboard.press('Shift+Backspace')
-  await expect(cmdBar.currentArgumentInput).toBeVisible()
-  await cmdBar.currentArgumentInput.locator('.cm-content').fill(newInput)
+  await page.keyboard.insertText(newInput)
   await cmdBar.progressCmdBar()
   await cmdBar.expectState({
     stage: 'review',
@@ -1258,11 +1250,10 @@ openSketch = startSketchOn(XY)
       Revolutions: '1',
       Radius: '5',
       Length: newInput,
-      CounterClockWise: '',
     },
     commandName: 'Helix',
   })
-  await cmdBar.progressCmdBar()
+  await cmdBar.submit()
   await toolbar.closeFeatureTreePane()
   await editor.openPane()
   await editor.expectEditor.toContain('length = ' + newInput)
@@ -1273,174 +1264,238 @@ openSketch = startSketchOn(XY)
     const operationButton = await toolbar.getFeatureTreeOperation('Helix', 0)
     await operationButton.click({ button: 'left' })
     await page.keyboard.press('Delete')
-    // Red plane is back
-    await scene.expectPixelColor([96, 52, 52], testPoint, 15)
+    await scene.settled(cmdBar)
+    await editor.expectEditor.not.toContain('helix')
+    await expect(
+      await toolbar.getFeatureTreeOperation('Helix', 0)
+    ).not.toBeVisible()
   })
 })
-const helixCases = [
-  {
-    selectionType: 'segment',
-    testPoint: { x: 513, y: 221 },
-    expectedOutput: `helix001 = helix( axis = seg01, radius = 1, revolutions = 20, angleStart = 0, ccw = false,)`,
-    expectedEditedOutput: `helix001 = helix( axis = seg01, radius = 5, revolutions = 20, angleStart = 0, ccw = false,)`,
-  },
-  {
-    selectionType: 'sweepEdge',
-    testPoint: { x: 564, y: 364 },
-    expectedOutput: `helix001 = helix( axis = getOppositeEdge(seg01), radius = 1, revolutions = 20, angleStart = 0, ccw = false,)`,
-    expectedEditedOutput: `helix001 = helix( axis = getOppositeEdge(seg01), radius = 5, revolutions = 20, angleStart = 0, ccw = false,)`,
-  },
-]
-helixCases.map(
-  ({ selectionType, testPoint, expectedOutput, expectedEditedOutput }) => {
-    test(`Helix point-and-click around ${selectionType}`, async ({
-      context,
-      page,
-      homePage,
-      scene,
-      editor,
-      toolbar,
-      cmdBar,
-    }) => {
-      page.on('console', console.log)
-      const initialCode = `sketch001 = startSketchOn(XZ)
-profile001 = startProfile(sketch001, at = [0, 0])
-  |> yLine(length = 100)
-  |> line(endAbsolute = [100, 0])
-  |> line(endAbsolute = [profileStartX(%), profileStartY(%)])
-  |> close()
-extrude001 = extrude(profile001, length = 100)`
-      // One dumb hardcoded screen pixel value
-      const [clickOnEdge] = scene.makeMouseHelpers(testPoint.x, testPoint.y)
-      await context.addInitScript((initialCode) => {
-        localStorage.setItem('persistCode', initialCode)
-      }, initialCode)
-      await page.setBodyDimensions({ width: 1000, height: 500 })
-      await homePage.goToModelingScene()
-      await scene.settled(cmdBar)
-      await test.step(`Go through the command bar flow`, async () => {
-        await toolbar.closePane('code')
-        await toolbar.helixButton.click()
-        await cmdBar.expectState({
-          stage: 'arguments',
-          currentArgKey: 'mode',
-          currentArgValue: '',
-          headerArguments: {
-            AngleStart: '',
-            Mode: '',
-            CounterClockWise: '',
-            Radius: '',
-            Revolutions: '',
-          },
-          highlightedHeaderArg: 'mode',
-          commandName: 'Helix',
-        })
-        await cmdBar.selectOption({ name: 'Edge' }).click()
-        await expect
-          .poll(() => page.getByText('Please select one').count())
-          .toBe(1)
-        await clickOnEdge()
-        await page.waitForTimeout(1000)
-        await cmdBar.progressCmdBar()
-        await page.waitForTimeout(1000)
-        await cmdBar.argumentInput.focus()
-        await page.waitForTimeout(1000)
-        await page.keyboard.insertText('20')
-        await cmdBar.progressCmdBar()
-        await page.keyboard.insertText('0')
-        await cmdBar.progressCmdBar()
-        await page.keyboard.insertText('1')
-        await cmdBar.progressCmdBar()
-        await page.keyboard.insertText('100')
-        await cmdBar.expectState({
-          stage: 'review',
-          headerArguments: {
-            Mode: 'Edge',
-            Edge: `1 ${selectionType}`,
-            AngleStart: '0',
-            Revolutions: '20',
-            Radius: '1',
-            CounterClockWise: '',
-          },
-          commandName: 'Helix',
-        })
-        await cmdBar.progressCmdBar()
-        await page.waitForTimeout(1000)
-      })
-      await test.step(`Confirm code is added to the editor, scene has changed`, async () => {
-        await toolbar.openPane('code')
-        await editor.expectEditor.toContain(expectedOutput)
-        await toolbar.closePane('code')
-      })
-      await test.step(`Edit helix through the feature tree`, async () => {
-        await toolbar.openPane('feature-tree')
-        const operationButton = await toolbar.getFeatureTreeOperation(
-          'Helix',
-          0
-        )
-        await operationButton.dblclick()
-        const initialInput = '1'
-        const newInput = '5'
-        await cmdBar.expectState({
-          commandName: 'Helix',
-          stage: 'arguments',
-          currentArgKey: 'CounterClockWise',
-          currentArgValue: '',
-          headerArguments: {
-            AngleStart: '0',
-            Revolutions: '20',
-            Radius: initialInput,
-            CounterClockWise: '',
-          },
-          highlightedHeaderArg: 'CounterClockWise',
-        })
-        await page
-          .getByRole('button', { name: 'radius', exact: false })
-          .click()
-        await expect(cmdBar.currentArgumentInput).toBeVisible()
-        await cmdBar.currentArgumentInput
-          .locator('.cm-content')
-          .fill(newInput)
-        await cmdBar.progressCmdBar()
-        await cmdBar.expectState({
-          stage: 'review',
-          headerArguments: {
-            AngleStart: '0',
-            Revolutions: '20',
-            Radius: newInput,
-            CounterClockWise: '',
-          },
-          commandName: 'Helix',
-        })
-        await cmdBar.progressCmdBar()
-        await toolbar.closePane('feature-tree')
-        await toolbar.openPane('code')
-        await editor.expectEditor.toContain(expectedEditedOutput)
-        await toolbar.closePane('code')
-      })
-      await test.step('Delete helix via feature tree selection', async () => {
-        await toolbar.openPane('feature-tree')
-        const operationButton = await toolbar.getFeatureTreeOperation(
-          'Helix',
-          0
-        )
-        await operationButton.click({ button: 'left' })
-        await page.keyboard.press('Delete')
-        await editor.expectEditor.not.toContain(expectedEditedOutput)
-        await expect(
-          await toolbar.getFeatureTreeOperation('Helix', 0)
-        ).not.toBeVisible()
-      })
-    })
-  }
-)
+test(`Helix point-and-click around segment`, async ({
+  context,
+  page,
+  homePage,
+  scene,
+  editor,
+  toolbar,
+  cmdBar,
+}) => {
+  const initialCode = `sketch001 = startSketchOn(XZ)
+profile001 = startProfile(sketch001, at = [0, 0])
+  |> yLine(length = 100)
+  |> line(endAbsolute = [100, 0])
+  |> line(endAbsolute = [profileStartX(%), profileStartY(%)])
+  |> close()`
+  await context.addInitScript((initialCode) => {
+    localStorage.setItem('persistCode', initialCode)
+  }, initialCode)
+  await page.setBodyDimensions({ width: 1000, height: 500 })
+  await homePage.goToModelingScene()
+  await scene.settled(cmdBar)
+  await test.step(`Go through the command bar flow`, async () => {
+    await toolbar.closePane('code')
+    await toolbar.helixButton.click()
+    await cmdBar.expectState(initialCmdBarStateHelix)
+    await cmdBar.selectOption({ name: 'Edge' }).click()
+    await editor.selectText('yLine(length = 100)')
+    await cmdBar.progressCmdBar()
+    await page.keyboard.insertText('1')
+    await cmdBar.progressCmdBar()
+    await page.keyboard.insertText('2')
+    await cmdBar.progressCmdBar()
+    await page.keyboard.insertText('3')
+    await cmdBar.progressCmdBar()
+    await cmdBar.expectState({
+      stage: 'review',
+      headerArguments: {
+        Mode: 'Edge',
+        Edge: `1 segment`,
+        AngleStart: '2',
+        Revolutions: '1',
+        Radius: '3',
+      },
+      commandName: 'Helix',
+    })
+    await cmdBar.submit()
+    await scene.settled(cmdBar)
+  })
+  await test.step(`Confirm code is added to the editor, scene has changed`, async () => {
+    await toolbar.openPane('code')
+    await editor.expectEditor.toContain(
+      `
+helix001 = helix(
+  axis = seg01,
+  radius = 3,
+  revolutions = 1,
+  angleStart = 2,
+)`,
+      { shouldNormalise: true }
+    )
+    await toolbar.closePane('code')
+  })
+})
test(`Helix point-and-click around sweepEdge with edit and delete flows`, async ({
context,
page,
homePage,
scene,
editor,
toolbar,
cmdBar,
}) => {
const initialCode = `sketch001 = startSketchOn(XZ)
profile001 = startProfile(sketch001, at = [0, 0])
|> yLine(length = 100)
|> line(endAbsolute = [100, 0])
|> line(endAbsolute = [profileStartX(%), profileStartY(%)])
|> close()
extrude001 = extrude(profile001, length = 100)`
// One dumb hardcoded screen pixel value to click on the sweepEdge, can't think of another way?
const testPoint = { x: 564, y: 364 }
const [clickOnEdge] = scene.makeMouseHelpers(testPoint.x, testPoint.y)
await context.addInitScript((initialCode) => {
localStorage.setItem('persistCode', initialCode)
}, initialCode)
await page.setBodyDimensions({ width: 1000, height: 500 })
await homePage.goToModelingScene()
await scene.settled(cmdBar)
await test.step(`Go through the command bar flow`, async () => {
await toolbar.closePane('code')
await toolbar.helixButton.click()
await cmdBar.expectState(initialCmdBarStateHelix)
await cmdBar.selectOption({ name: 'Edge' }).click()
await expect
.poll(() => page.getByText('Please select one').count())
.toBe(1)
await clickOnEdge()
await cmdBar.progressCmdBar()
await cmdBar.argumentInput.focus()
await page.keyboard.insertText('20')
await cmdBar.progressCmdBar()
await page.keyboard.insertText('0')
await cmdBar.progressCmdBar()
await page.keyboard.insertText('1')
await cmdBar.progressCmdBar()
await page.keyboard.insertText('100')
await cmdBar.expectState({
stage: 'review',
headerArguments: {
Mode: 'Edge',
Edge: `1 sweepEdge`,
AngleStart: '0',
Revolutions: '20',
Radius: '1',
},
commandName: 'Helix',
})
await cmdBar.submit()
await scene.settled(cmdBar)
})
await test.step(`Confirm code is added to the editor, scene has changed`, async () => {
await toolbar.openPane('code')
await editor.expectEditor.toContain(
`
helix001 = helix(
axis = getOppositeEdge(seg01),
radius = 1,
revolutions = 20,
angleStart = 0,
)`,
{ shouldNormalise: true }
)
await toolbar.closePane('code')
})
await test.step(`Edit helix through the feature tree`, async () => {
await toolbar.openPane('feature-tree')
const operationButton = await toolbar.getFeatureTreeOperation('Helix', 0)
await operationButton.dblclick()
const initialInput = '1'
const newInput = '5'
await cmdBar.expectState({
commandName: 'Helix',
stage: 'arguments',
currentArgKey: 'radius',
currentArgValue: initialInput,
headerArguments: {
AngleStart: '0',
Revolutions: '20',
Radius: initialInput,
},
highlightedHeaderArg: 'radius',
})
await page.keyboard.insertText(newInput)
await cmdBar.progressCmdBar()
await cmdBar.expectState({
stage: 'review',
headerArguments: {
AngleStart: '0',
Revolutions: '20',
Radius: newInput,
},
commandName: 'Helix',
})
await cmdBar.clickOptionalArgument('ccw')
await cmdBar.expectState({
commandName: 'Helix',
stage: 'arguments',
currentArgKey: 'CounterClockWise',
currentArgValue: '',
headerArguments: {
AngleStart: '0',
Revolutions: '20',
Radius: newInput,
CounterClockWise: '',
},
highlightedHeaderArg: 'CounterClockWise',
})
await cmdBar.selectOption({ name: 'True' }).click()
await cmdBar.expectState({
stage: 'review',
headerArguments: {
AngleStart: '0',
Revolutions: '20',
Radius: newInput,
CounterClockWise: '',
},
commandName: 'Helix',
})
await cmdBar.submit()
await toolbar.closePane('feature-tree')
await toolbar.openPane('code')
await editor.expectEditor.toContain(
`
helix001 = helix(
axis = getOppositeEdge(seg01),
radius = 5,
revolutions = 20,
angleStart = 0,
ccw = true,
)`,
{ shouldNormalise: true }
)
await toolbar.closePane('code')
})
await test.step('Delete helix via feature tree selection', async () => {
await toolbar.openPane('feature-tree')
const operationButton = await toolbar.getFeatureTreeOperation('Helix', 0)
await operationButton.click({ button: 'left' })
await page.keyboard.press('Delete')
await editor.expectEditor.not.toContain('helix')
await expect(
await toolbar.getFeatureTreeOperation('Helix', 0)
).not.toBeVisible()
})
})
 test('Helix point-and-click on cylinder', async ({
   context,
@@ -1470,26 +1525,12 @@ extrude001 = extrude(profile001, length = 100)
   // One dumb hardcoded screen pixel value
   const testPoint = { x: 620, y: 257 }
   const [clickOnWall] = scene.makeMouseHelpers(testPoint.x, testPoint.y)
-  const expectedOutput = `helix001 = helix( cylinder = extrude001, revolutions = 1, angleStart = 360, ccw = false,)`
-  const expectedLine = `cylinder = extrude001,`
-  const expectedEditedOutput = `helix001 = helix( cylinder = extrude001, revolutions = 1, angleStart = 360, ccw = true,)`
+  const expectedOutput = `helix001 = helix(cylinder = extrude001, revolutions = 1, angleStart = 360)`
+  const expectedEditedOutput = `helix001 = helix(cylinder = extrude001, revolutions = 1, angleStart = 10)`
   await test.step(`Go through the command bar flow`, async () => {
     await toolbar.helixButton.click()
-    await cmdBar.expectState({
-      stage: 'arguments',
-      currentArgKey: 'mode',
-      currentArgValue: '',
-      headerArguments: {
-        Mode: '',
-        AngleStart: '',
-        Revolutions: '',
-        Radius: '',
-        CounterClockWise: '',
-      },
-      highlightedHeaderArg: 'mode',
-      commandName: 'Helix',
-    })
+    await cmdBar.expectState(initialCmdBarStateHelix)
     await cmdBar.selectOption({ name: 'Cylinder' }).click()
     await cmdBar.expectState({
       stage: 'arguments',
@@ -1500,7 +1541,6 @@ extrude001 = extrude(profile001, length = 100)
       Cylinder: '',
       AngleStart: '',
       Revolutions: '',
-      CounterClockWise: '',
     },
     highlightedHeaderArg: 'cylinder',
     commandName: 'Helix',
@@ -1516,18 +1556,17 @@ extrude001 = extrude(profile001, length = 100)
       Cylinder: '1 face',
       AngleStart: '360',
       Revolutions: '1',
-      CounterClockWise: '',
     },
     commandName: 'Helix',
   })
-  await cmdBar.progressCmdBar()
+  await cmdBar.submit()
 })
 await test.step(`Confirm code is added to the editor, scene has changed`, async () => {
   await editor.expectEditor.toContain(expectedOutput)
   await editor.expectState({
     diagnostics: [],
-    activeLines: [expectedLine],
+    activeLines: [expectedOutput],
     highlightedCode: '',
   })
 })
@@ -1539,22 +1578,21 @@ extrude001 = extrude(profile001, length = 100)
   await cmdBar.expectState({
     commandName: 'Helix',
     stage: 'arguments',
-    currentArgKey: 'CounterClockWise',
-    currentArgValue: '',
+    currentArgKey: 'angleStart',
+    currentArgValue: '360',
     headerArguments: {
       AngleStart: '360',
       Revolutions: '1',
-      CounterClockWise: '',
     },
-    highlightedHeaderArg: 'CounterClockWise',
+    highlightedHeaderArg: 'angleStart',
   })
-  await cmdBar.selectOption({ name: 'True' }).click()
+  await page.keyboard.insertText('10')
+  await cmdBar.progressCmdBar()
   await cmdBar.expectState({
     stage: 'review',
     headerArguments: {
-      AngleStart: '360',
+      AngleStart: '10',
       Revolutions: '1',
-      CounterClockWise: 'true',
     },
     commandName: 'Helix',
   })

Binary file not shown (52 KiB).

Binary file not shown (58 KiB).
flake.lock generated

@@ -20,11 +20,11 @@
   },
   "nixpkgs": {
     "locked": {
-      "lastModified": 1745998881,
-      "narHash": "sha256-vonyYAKJSlsX4n9GCsS0pHxR6yCrfqBIuGvANlkwG6U=",
+      "lastModified": 1750865895,
+      "narHash": "sha256-p2dWAQcLVzquy9LxYCZPwyUdugw78Qv3ChvnX755qHA=",
       "owner": "NixOS",
       "repo": "nixpkgs",
-      "rev": "423d2df5b04b4ee7688c3d71396e872afa236a89",
+      "rev": "61c0f513911459945e2cb8bf333dc849f1b976ff",
       "type": "github"
     },
     "original": {
@@ -36,11 +36,11 @@
   },
   "nixpkgs_2": {
     "locked": {
-      "lastModified": 1745998881,
-      "narHash": "sha256-vonyYAKJSlsX4n9GCsS0pHxR6yCrfqBIuGvANlkwG6U=",
+      "lastModified": 1750865895,
+      "narHash": "sha256-p2dWAQcLVzquy9LxYCZPwyUdugw78Qv3ChvnX755qHA=",
       "owner": "NixOS",
       "repo": "nixpkgs",
-      "rev": "423d2df5b04b4ee7688c3d71396e872afa236a89",
+      "rev": "61c0f513911459945e2cb8bf333dc849f1b976ff",
       "type": "github"
     },
     "original": {
@@ -78,11 +78,11 @@
     "nixpkgs": "nixpkgs_3"
   },
   "locked": {
-    "lastModified": 1745980514,
-    "narHash": "sha256-CITAeiuXGjDvT5iZBXr6vKVWQwsUQLJUMFO91bfJFC4=",
+    "lastModified": 1750964660,
+    "narHash": "sha256-YQ6EyFetjH1uy5JhdhRdPe6cuNXlYpMAQePFfZj4W7M=",
     "owner": "oxalica",
     "repo": "rust-overlay",
-    "rev": "7fbdae44b0f40ea432e46fd152ad8be0f8f41ad6",
+    "rev": "04f0fcfb1a50c63529805a798b4b5c21610ff390",
     "type": "github"
   },
   "original": {


@@ -125,18 +125,57 @@ test('Shows a loading spinner when uninitialized credit count', async () => {
   await expect(queryByTestId('spinner')).toBeVisible()
 })
+const unKnownTierData = {
+  balance: {
+    monthlyApiCreditsRemaining: 10,
+    stableApiCreditsRemaining: 25,
+  },
+  subscriptions: {
+    monthlyPayAsYouGoApiCreditsTotal: 20,
+    name: "unknown",
+  }
+}
+const freeTierData = {
+  balance: {
+    monthlyApiCreditsRemaining: 10,
+    stableApiCreditsRemaining: 0,
+  },
+  subscriptions: {
+    monthlyPayAsYouGoApiCreditsTotal: 20,
+    name: "free",
+  }
+}
+const proTierData = {
+  // These are all ignored
+  balance: {
+    monthlyApiCreditsRemaining: 10,
+    stableApiCreditsRemaining: 0,
+  },
+  subscriptions: {
+    // This should be ignored because it's Pro tier.
+    monthlyPayAsYouGoApiCreditsTotal: 20,
+    name: "pro",
+  }
+}
+const enterpriseTierData = {
+  // These are all ignored, user is part of an org.
+  balance: {
+    monthlyApiCreditsRemaining: 10,
+    stableApiCreditsRemaining: 0,
+  },
+  subscriptions: {
+    // This should be ignored because it's Pro tier.
+    monthlyPayAsYouGoApiCreditsTotal: 20,
+    // This should be ignored because the user is part of an Org.
+    name: "free",
+  }
+}
 test('Shows the total credits for Unknown subscription', async () => {
-  const data = {
-    balance: {
-      monthlyApiCreditsRemaining: 10,
-      stableApiCreditsRemaining: 25,
-    },
-    subscriptions: {
-      monthlyPayAsYouGoApiCreditsTotal: 20,
-      name: "unknown",
-    }
-  }
+  const data = unKnownTierData
   server.use(
     http.get('*/user/payment/balance', (req, res, ctx) => {
       return HttpResponse.json(createUserPaymentBalanceResponse(data.balance))
@@ -166,17 +205,7 @@ test('Shows the total credits for Unknown subscription', async () => {
 })
 test('Progress bar reflects ratio left of Free subscription', async () => {
-  const data = {
-    balance: {
-      monthlyApiCreditsRemaining: 10,
-      stableApiCreditsRemaining: 0,
-    },
-    subscriptions: {
-      monthlyPayAsYouGoApiCreditsTotal: 20,
-      name: "free",
-    }
-  }
+  const data = freeTierData
   server.use(
     http.get('*/user/payment/balance', (req, res, ctx) => {
       return HttpResponse.json(createUserPaymentBalanceResponse(data.balance))
@@ -212,19 +241,7 @@ test('Progress bar reflects ratio left of Free subscription', () => {
   })
 })
 test('Shows infinite credits for Pro subscription', async () => {
-  const data = {
-    // These are all ignored
-    balance: {
-      monthlyApiCreditsRemaining: 10,
-      stableApiCreditsRemaining: 0,
-    },
-    subscriptions: {
-      // This should be ignored because it's Pro tier.
-      monthlyPayAsYouGoApiCreditsTotal: 20,
-      name: "pro",
-    }
-  }
+  const data = proTierData
   server.use(
     http.get('*/user/payment/balance', (req, res, ctx) => {
       return HttpResponse.json(createUserPaymentBalanceResponse(data.balance))
@@ -255,19 +272,7 @@ test('Shows infinite credits for Pro subscription', async () => {
   await expect(queryByTestId('billing-remaining-progress-bar-inline')).toBe(null)
 })
 test('Shows infinite credits for Enterprise subscription', async () => {
-  const data = {
-    // These are all ignored, user is part of an org.
-    balance: {
-      monthlyApiCreditsRemaining: 10,
-      stableApiCreditsRemaining: 0,
-    },
-    subscriptions: {
-      // This should be ignored because it's Pro tier.
-      monthlyPayAsYouGoApiCreditsTotal: 20,
-      // This should be ignored because the user is part of an Org.
-      name: "free",
-    }
-  }
+  const data = enterpriseTierData
   server.use(
     http.get('*/user/payment/balance', (req, res, ctx) => {
@@ -297,3 +302,58 @@ test('Shows infinite credits for Enterprise subscription', async () => {
   await expect(queryByTestId('infinity')).toBeVisible()
   await expect(queryByTestId('billing-remaining-progress-bar-inline')).toBe(null)
 })
test('Show upgrade button if credits are not infinite', async () => {
const data = freeTierData
server.use(
http.get('*/user/payment/balance', (req, res, ctx) => {
return HttpResponse.json(createUserPaymentBalanceResponse(data.balance))
}),
http.get('*/user/payment/subscriptions', (req, res, ctx) => {
return HttpResponse.json(createUserPaymentSubscriptionsResponse(data.subscriptions))
}),
http.get('*/org', (req, res, ctx) => {
return new HttpResponse(403)
}),
)
const billingActor = createActor(billingMachine, { input: BILLING_CONTEXT_DEFAULTS }).start()
const { queryByTestId } = render(<BillingDialog
billingActor={billingActor}
/>)
await act(() => {
billingActor.send({ type: BillingTransition.Update, apiToken: "it doesn't matter wtf this is :)" })
})
await expect(queryByTestId('billing-upgrade-button')).toBeVisible()
})
test('Hide upgrade button if credits are infinite', async () => {
const data = enterpriseTierData
server.use(
http.get('*/user/payment/balance', (req, res, ctx) => {
return HttpResponse.json(createUserPaymentBalanceResponse(data.balance))
}),
http.get('*/user/payment/subscriptions', (req, res, ctx) => {
return HttpResponse.json(createUserPaymentSubscriptionsResponse(data.subscriptions))
}),
// Ok finally the first use of an org lol
http.get('*/org', (req, res, ctx) => {
return HttpResponse.json(createOrgResponse())
}),
)
const billingActor = createActor(billingMachine, { input: BILLING_CONTEXT_DEFAULTS }).start()
const { queryByTestId } = render(<BillingDialog
billingActor={billingActor}
/>)
await act(() => {
billingActor.send({ type: BillingTransition.Update, apiToken: "it doesn't matter wtf this is :)" })
})
await expect(queryByTestId('billing-upgrade-button')).toBe(null)
})

package-lock.json (generated, 1118 changed lines): file diff suppressed because it is too large.

@@ -110,8 +110,11 @@
    "remove-importmeta": "sed -i 's/import.meta.url/window.location.origin/g' \"./rust/kcl-wasm-lib/pkg/kcl_wasm_lib.js\"; sed -i '' 's/import.meta.url/window.location.origin/g' \"./rust/kcl-wasm-lib/pkg/kcl_wasm_lib.js\" || echo \"sed for both mac and linux\"",
    "lint-fix": "eslint --fix --ext .ts --ext .tsx src e2e packages/codemirror-lsp-client/src rust/kcl-language-server/client/src",
    "lint": "eslint --max-warnings 0 --ext .ts --ext .tsx src e2e packages/codemirror-lsp-client/src rust/kcl-language-server/client/src",
+    "url-checker":"./scripts/url-checker.sh",
+    "url-checker:overwrite":"npm run url-checker > scripts/known/urls.txt",
+    "url-checker:diff":"./scripts/diff-url-checker.sh",
    "circular-deps": "dpdm --no-warning --no-tree -T --skip-dynamic-imports=circular src/index.tsx",
-    "circular-deps:overwrite": "npm run circular-deps | sed '$d' | grep -v '^npm run' > known-circular.txt",
+    "circular-deps:overwrite": "npm run circular-deps | sed '$d' | grep -v '^npm run' > scripts/known/circular.txt",
    "circular-deps:diff": "./scripts/diff-circular-deps.sh",
    "circular-deps:diff:nodejs": "npm run circular-deps:diff || node ./scripts/diff.js",
    "files:set-version": "echo \"$(jq --arg v \"$VERSION\" '.version=$v' package.json --indent 2)\" > package.json",


@@ -29,7 +29,7 @@
    "vscode-uri": "^3.1.0"
  },
  "devDependencies": {
-    "@types/node": "^22.14.1",
+    "@types/node": "^24.0.7",
    "ts-node": "^10.9.2"
  }
}

Binary file not shown (image; 84 KiB before and after).

rust/Cargo.lock (generated, 144 changed lines):

@ -178,7 +178,7 @@ checksum = "3b43422f69d8ff38f95f1b2bb76517c91589a924d1559a0e935d7c8ce0274c11"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -189,7 +189,7 @@ checksum = "e539d3fca749fcee5236ab05e93a52867dd549cc157c8cb7f99595f3cedffdb5"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -211,7 +211,7 @@ checksum = "e12882f59de5360c748c4cbf569a042d5fb0eb515f7bea9c1f470b47f6ffbd73"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -514,7 +514,7 @@ dependencies = [
"heck", "heck",
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -740,7 +740,7 @@ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"strsim", "strsim",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -751,7 +751,7 @@ checksum = "d336a2a514f6ccccaa3e09b02d41d35330c07ddf03a62165fcec10bb561c7806"
dependencies = [ dependencies = [
"darling_core", "darling_core",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -810,7 +810,7 @@ checksum = "30542c1ad912e0e3d22a1935c290e12e8a29d704a420177a31faad4a601a0800"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -831,7 +831,7 @@ dependencies = [
"darling", "darling",
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -841,7 +841,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ab63b0e2bf4d5928aff72e83a7dace85d7bba5fe12dcc3c5a572d78caffd3f3c" checksum = "ab63b0e2bf4d5928aff72e83a7dace85d7bba5fe12dcc3c5a572d78caffd3f3c"
dependencies = [ dependencies = [
"derive_builder_core", "derive_builder_core",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -906,7 +906,7 @@ checksum = "97369cbbc041bc366949bc74d34658d6cda5621039731c6310521892a3a20ae0"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -944,7 +944,7 @@ checksum = "a1ab991c1362ac86c61ab6f556cff143daa22e5a15e4e189df818b2fd19fe65b"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -1119,7 +1119,7 @@ checksum = "162ee34ebcb7c64a8abebc059ce0fee27c2262618d7b60ed8faf72fef13c3650"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -1223,7 +1223,7 @@ dependencies = [
"inflections", "inflections",
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -1599,7 +1599,7 @@ checksum = "1ec89e9337638ecdc08744df490b221a7399bf8d164eb52a665454e60e075ad6"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -1814,7 +1814,7 @@ dependencies = [
[[package]] [[package]]
name = "kcl-bumper" name = "kcl-bumper"
version = "0.1.82" version = "0.1.83"
dependencies = [ dependencies = [
"anyhow", "anyhow",
"clap", "clap",
@ -1825,26 +1825,26 @@ dependencies = [
[[package]] [[package]]
name = "kcl-derive-docs" name = "kcl-derive-docs"
version = "0.1.82" version = "0.1.83"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
name = "kcl-directory-test-macro" name = "kcl-directory-test-macro"
version = "0.1.82" version = "0.1.83"
dependencies = [ dependencies = [
"convert_case", "convert_case",
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
name = "kcl-language-server" name = "kcl-language-server"
version = "0.2.82" version = "0.2.83"
dependencies = [ dependencies = [
"anyhow", "anyhow",
"clap", "clap",
@ -1865,7 +1865,7 @@ dependencies = [
[[package]] [[package]]
name = "kcl-language-server-release" name = "kcl-language-server-release"
version = "0.1.82" version = "0.1.83"
dependencies = [ dependencies = [
"anyhow", "anyhow",
"clap", "clap",
@ -1885,7 +1885,7 @@ dependencies = [
[[package]] [[package]]
name = "kcl-lib" name = "kcl-lib"
version = "0.2.82" version = "0.2.83"
dependencies = [ dependencies = [
"anyhow", "anyhow",
"approx 0.5.1", "approx 0.5.1",
@ -1962,7 +1962,7 @@ dependencies = [
[[package]] [[package]]
name = "kcl-python-bindings" name = "kcl-python-bindings"
version = "0.3.82" version = "0.3.83"
dependencies = [ dependencies = [
"anyhow", "anyhow",
"kcl-lib", "kcl-lib",
@ -1977,7 +1977,7 @@ dependencies = [
[[package]] [[package]]
name = "kcl-test-server" name = "kcl-test-server"
version = "0.1.82" version = "0.1.83"
dependencies = [ dependencies = [
"anyhow", "anyhow",
"hyper 0.14.32", "hyper 0.14.32",
@ -1990,7 +1990,7 @@ dependencies = [
[[package]] [[package]]
name = "kcl-to-core" name = "kcl-to-core"
version = "0.1.82" version = "0.1.83"
dependencies = [ dependencies = [
"anyhow", "anyhow",
"async-trait", "async-trait",
@ -2004,7 +2004,7 @@ dependencies = [
[[package]] [[package]]
name = "kcl-wasm-lib" name = "kcl-wasm-lib"
version = "0.1.82" version = "0.1.83"
dependencies = [ dependencies = [
"anyhow", "anyhow",
"bson", "bson",
@ -2071,9 +2071,9 @@ dependencies = [
[[package]] [[package]]
name = "kittycad-modeling-cmds" name = "kittycad-modeling-cmds"
version = "0.2.124" version = "0.2.125"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "221aa4670a7ad7dc8f1e4e0f9990bf3cff0a64417eb76493bafe5bbbc1f8350a" checksum = "cfd09d95f8bbeb090d4d1137c9bf421eb75763f7a30e4a9e8eefa249ddf20bd3"
dependencies = [ dependencies = [
"anyhow", "anyhow",
"chrono", "chrono",
@ -2104,7 +2104,7 @@ dependencies = [
"kittycad-modeling-cmds-macros-impl", "kittycad-modeling-cmds-macros-impl",
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -2115,7 +2115,7 @@ checksum = "fdb4ee23cc996aa2dca7584d410e8826e08161e1ac4335bb646d5ede33f37cb3"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -2311,7 +2311,7 @@ checksum = "db5b29714e950dbb20d5e6f74f9dcec4edbcc1067bb7f8ed198c097b8c1a818b"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -2640,7 +2640,7 @@ dependencies = [
"regex", "regex",
"regex-syntax 0.8.5", "regex-syntax 0.8.5",
"structmeta", "structmeta",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -2654,7 +2654,7 @@ dependencies = [
"regex", "regex",
"regex-syntax 0.8.5", "regex-syntax 0.8.5",
"structmeta", "structmeta",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -2710,7 +2710,7 @@ dependencies = [
"pest_meta", "pest_meta",
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -2754,7 +2754,7 @@ dependencies = [
"phf_shared", "phf_shared",
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -2809,7 +2809,7 @@ checksum = "6e918e4ff8c4549eb882f14b3a4bc8c8bc93de829416eacf579f1207a8fbf861"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -2921,7 +2921,7 @@ dependencies = [
"proc-macro-error-attr2", "proc-macro-error-attr2",
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -2981,7 +2981,7 @@ dependencies = [
"proc-macro2", "proc-macro2",
"pyo3-macros-backend", "pyo3-macros-backend",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -2994,7 +2994,7 @@ dependencies = [
"proc-macro2", "proc-macro2",
"pyo3-build-config", "pyo3-build-config",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -3492,7 +3492,7 @@ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"serde_derive_internals", "serde_derive_internals",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -3556,7 +3556,7 @@ checksum = "5b0276cf7f2c73365f7157c8123c21cd9a50fbbd844757af28ca1f5925fc2a00"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -3567,7 +3567,7 @@ checksum = "18d26a20a969b9e3fdf2fc2d9f21eda6c40e2de84c9408bb5d3b05d499aae711"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -3591,14 +3591,14 @@ checksum = "175ee3e80ae9982737ca543e96133087cbd9a485eecc3bc4de9c1a37b47ea59c"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
name = "serde_spanned" name = "serde_spanned"
version = "0.6.8" version = "0.6.9"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "87607cb1398ed59d48732e575a4c28a7a8ebf2454b964fe3f224f2afc07909e1" checksum = "bf41e0cfaf7226dca15e8197172c295a782857fcb97fad1808a166870dee75a3"
dependencies = [ dependencies = [
"serde", "serde",
] ]
@ -3815,7 +3815,7 @@ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"structmeta-derive", "structmeta-derive",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -3826,7 +3826,7 @@ checksum = "152a0b65a590ff6c3da95cabe2353ee04e6167c896b28e3b14478c2636c922fc"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -3848,7 +3848,7 @@ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"rustversion", "rustversion",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -3891,9 +3891,9 @@ dependencies = [
[[package]] [[package]]
name = "syn" name = "syn"
version = "2.0.103" version = "2.0.104"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e4307e30089d6fd6aff212f2da3a1f9e32f3223b1f010fb09b7c95f90f3ca1e8" checksum = "17b6f705963418cdb9927482fa304bc562ece2fdd4f616084c50b7023b435a40"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
@ -3917,7 +3917,7 @@ checksum = "c8af7666ab7b6390ab78131fb5b0fce11d6b7a6951602017c35fa82800708971"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -3941,7 +3941,7 @@ dependencies = [
"proc-macro-error2", "proc-macro-error2",
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -4050,7 +4050,7 @@ checksum = "4fee6c4efc90059e10f81e6d42c60a18f76588c3d74cb83a0b242a2b6c7504c1"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -4061,7 +4061,7 @@ checksum = "7f7cf42b4507d8ea322120659672cf1b9dbb93f8f2d4ecfd6e51350ff5b17a1d"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -4173,7 +4173,7 @@ checksum = "6e06d43f1345a3bcd39f6a56dbb7dcab2ba47e68e8ac134855e7e2bdbaf8cab8"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -4217,9 +4217,9 @@ dependencies = [
[[package]] [[package]]
name = "toml" name = "toml"
version = "0.8.22" version = "0.8.23"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "05ae329d1f08c4d17a59bed7ff5b5a769d062e64a62d34a3261b219e62cd5aae" checksum = "dc1beb996b9d83529a9e75c17a1686767d148d70663143c7854d8b4a09ced362"
dependencies = [ dependencies = [
"serde", "serde",
"serde_spanned", "serde_spanned",
@ -4238,9 +4238,9 @@ dependencies = [
[[package]] [[package]]
name = "toml_edit" name = "toml_edit"
version = "0.22.26" version = "0.22.27"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "310068873db2c5b3e7659d2cc35d21855dbafa50d1ce336397c666e3cb08137e" checksum = "41fe8c660ae4257887cf66394862d21dbca4a6ddd26f04a3560410406a2f819a"
dependencies = [ dependencies = [
"indexmap 2.9.0", "indexmap 2.9.0",
"serde", "serde",
@ -4341,7 +4341,7 @@ checksum = "84fd902d4e0b9a4b27f2f440108dc034e1758628a9b702f8ec61ad66355422fa"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -4369,7 +4369,7 @@ checksum = "395ae124c09f9e6918a2310af6038fba074bcf474ac352496d5910dd59a2226d"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -4449,7 +4449,7 @@ checksum = "e9d4ed7b4c18cc150a6a0a1e9ea1ecfa688791220781af6e119f9599a8502a0a"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
"termcolor", "termcolor",
] ]
@ -4635,7 +4635,7 @@ dependencies = [
"proc-macro-error2", "proc-macro-error2",
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -4706,7 +4706,7 @@ dependencies = [
"log", "log",
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
"wasm-bindgen-shared", "wasm-bindgen-shared",
] ]
@ -4742,7 +4742,7 @@ checksum = "8ae87ea40c9f689fc23f209965b6fb8a99ad69aeeb0231408be24920604395de"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
"wasm-bindgen-backend", "wasm-bindgen-backend",
"wasm-bindgen-shared", "wasm-bindgen-shared",
] ]
@ -4777,7 +4777,7 @@ checksum = "17d5042cc5fa009658f9a7333ef24291b1291a25b6382dd68862a7f3b969f69b"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -5067,7 +5067,7 @@ checksum = "2380878cad4ac9aac1e2435f3eb4020e8374b5f13c296cb75b4620ff8e229154"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
"synstructure", "synstructure",
] ]
@ -5112,7 +5112,7 @@ checksum = "fa4f8080344d4671fb4e831a13ad1e68092748387dfc4f55e356242fae12ce3e"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -5123,7 +5123,7 @@ checksum = "6352c01d0edd5db859a63e2605f4ea3183ddbd15e2c4a9e7d32184df75e4f154"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -5143,7 +5143,7 @@ checksum = "d71e5d6e06ab090c67b5e44993ec16b72dcbaabc526db883a360057678b48502"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
"synstructure", "synstructure",
] ]
@ -5164,7 +5164,7 @@ checksum = "ce36e65b0d2999d2aafac989fb249189a141aee1f53c612c1f37d72631959f69"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]
@ -5186,7 +5186,7 @@ checksum = "6eafa6dfb17584ea3e2bd6e76e0cc15ad7af12b09abdd1ca55961bed9b1063c6"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
"syn 2.0.103", "syn 2.0.104",
] ]
[[package]] [[package]]


@@ -36,7 +36,7 @@ dashmap = { version = "6.1.0" }
http = "1"
indexmap = "2.9.0"
kittycad = { version = "0.3.37", default-features = false, features = ["js", "requests"] }
-kittycad-modeling-cmds = { version = "0.2.124", features = ["ts-rs", "websocket"] }
+kittycad-modeling-cmds = { version = "0.2.125", features = ["ts-rs", "websocket"] }
lazy_static = "1.5.0"
miette = "7.6.0"
pyo3 = { version = "0.24.2" }

@@ -60,6 +60,6 @@ lossy_float_literal = "warn"
result_large_err = "allow"

# Example: how to point modeling-app at a different repo (e.g. a branch or a local clone)
-#[patch.crates-io]
-#kittycad-modeling-cmds = { path = "../../../modeling-api/modeling-cmds" }
-#kittycad-modeling-session = { path = "../../../modeling-api/modeling-session" }
+# [patch.crates-io]
+# kittycad-modeling-cmds = { path = "../../modeling-api/modeling-cmds/" }
+# kittycad-modeling-session = { path = "../../modeling-api/modeling-session" }
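For context, that commented-out block is meant to be uncommented when pointing the app at a local checkout of the modeling API instead of crates.io; a sketch of the active form, assuming `modeling-api` is cloned as described by the updated relative paths:

```toml
[patch.crates-io]
kittycad-modeling-cmds = { path = "../../modeling-api/modeling-cmds/" }
kittycad-modeling-session = { path = "../../modeling-api/modeling-session" }
```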


@@ -8,6 +8,9 @@ lint:
    # Ensure we can build without extra feature flags.
    cargo clippy -p kcl-lib --all-targets -- -D warnings

+lint-fix:
+    cargo clippy --workspace --all-targets --all-features --fix
+
# Run the stdlib docs generation
redo-kcl-stdlib-docs-no-imgs:
    EXPECTORATE=overwrite {{cnr}} {{kcl_lib_flags}} docs::gen_std_tests::test_generate_stdlib


@@ -1,7 +1,7 @@
[package]
name = "kcl-bumper"
-version = "0.1.82"
+version = "0.1.83"
edition = "2021"
repository = "https://github.com/KittyCAD/modeling-api"
rust-version = "1.76"

@@ -19,7 +19,7 @@ anyhow = { workspace = true }
clap = { workspace = true, features = ["derive"] }
semver = "1.0.25"
serde = { workspace = true }
-toml_edit = "0.22.26"
+toml_edit = "0.22.27"

[lints]
workspace = true


@@ -1,7 +1,7 @@
[package]
name = "kcl-derive-docs"
description = "A tool for generating documentation from Rust derive macros"
-version = "0.1.82"
+version = "0.1.83"
edition = "2021"
license = "MIT"
repository = "https://github.com/KittyCAD/modeling-app"

@@ -14,7 +14,7 @@ bench = false
[dependencies]
proc-macro2 = "1"
quote = "1"
-syn = { version = "2.0.103", features = ["full"] }
+syn = { version = "2.0.104", features = ["full"] }

[lints]
workspace = true


@@ -97,8 +97,11 @@ pub const TEST_NAMES: &[&str] = &[
    "std-offsetPlane-2",
    "std-offsetPlane-3",
    "std-offsetPlane-4",
+    "std-sketch-planeOf-0",
    "std-sketch-circle-0",
    "std-sketch-circle-1",
+    "std-sketch-rectangle-0",
+    "std-sketch-rectangle-1",
    "std-sketch-patternTransform2d-0",
    "std-sketch-revolve-0",
    "std-sketch-revolve-1",


@@ -1,7 +1,7 @@
[package]
name = "kcl-directory-test-macro"
description = "A tool for generating tests from a directory of kcl files"
-version = "0.1.82"
+version = "0.1.83"
edition = "2021"
license = "MIT"
repository = "https://github.com/KittyCAD/modeling-app"

@@ -14,7 +14,7 @@ bench = false
convert_case = "0.8.0"
proc-macro2 = "1"
quote = "1"
-syn = { version = "2.0.103", features = ["full"] }
+syn = { version = "2.0.104", features = ["full"] }

[lints]
workspace = true


@@ -1,6 +1,6 @@
[package]
name = "kcl-language-server-release"
-version = "0.1.82"
+version = "0.1.83"
edition = "2021"
authors = ["KittyCAD Inc <kcl@kittycad.io>"]
publish = false


@@ -42,7 +42,7 @@ impl Build {
            .to_string();

        if !stable {
-            version = format!("{}-nightly", version);
+            version = format!("{version}-nightly");
        }

        let release_tag = if stable {

@@ -59,10 +59,7 @@ impl Build {
        if stable && !release_tag.contains(&version) {
            // bail early if the tag doesn't match the version
            // TODO: error here when we use the tags with kcl
-            println!(
-                "Tag {} doesn't match version {}. Did you forget to update Cargo.toml?",
-                release_tag, version
-            );
+            println!("Tag {release_tag} doesn't match version {version}. Did you forget to update Cargo.toml?");
        }

        build_server(sh, &version, &target)?;


@@ -95,10 +95,10 @@ async fn main() -> Result<()> {
    // Format fields using the provided closure.
    // We want to make this very concise otherwise the logs are not able to be read by humans.
    let format = tracing_subscriber::fmt::format::debug_fn(|writer, field, value| {
-        if format!("{}", field) == "message" {
-            write!(writer, "{}: {:?}", field, value)
+        if format!("{field}") == "message" {
+            write!(writer, "{field}: {value:?}")
        } else {
-            write!(writer, "{}", field)
+            write!(writer, "{field}")
        }
    })
    // Separate each field with a comma.


@@ -2,7 +2,7 @@
name = "kcl-language-server"
description = "A language server for KCL."
authors = ["KittyCAD Inc <kcl@kittycad.io>"]
-version = "0.2.82"
+version = "0.2.83"
edition = "2021"
license = "MIT"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html


@@ -123,7 +123,7 @@
    "@vscode/test-electron": "^2.4.1",
    "@vscode/vsce": "^3.3.2",
    "cross-env": "^7.0.3",
-    "esbuild": "^0.25.2",
+    "esbuild": "^0.25.3",
    "glob": "^11.0.1",
    "mocha": "^11.1.0",
    "typescript": "^5.8.3"


@@ -87,10 +87,10 @@ async fn main() -> Result<()> {
    // Format fields using the provided closure.
    // We want to make this very concise otherwise the logs are not able to be read by humans.
    let format = tracing_subscriber::fmt::format::debug_fn(|writer, field, value| {
-        if format!("{}", field) == "message" {
-            write!(writer, "{}: {:?}", field, value)
+        if format!("{field}") == "message" {
+            write!(writer, "{field}: {value:?}")
        } else {
-            write!(writer, "{}", field)
+            write!(writer, "{field}")
        }
    })
    // Separate each field with a comma.

@@ -151,7 +151,7 @@ async fn run_cmd(opts: &Opts) -> Result<()> {
    tokio::spawn(async move {
        if let Some(sig) = signals.forever().next() {
-            log::info!("received signal: {:?}", sig);
+            log::info!("received signal: {sig:?}");
            log::info!("triggering cleanup...");

            // Exit the process.


@@ -1,11 +1,11 @@
[package]
name = "kcl-lib"
description = "KittyCAD Language implementation and tools"
-version = "0.2.82"
+version = "0.2.83"
-edition = "2021"
+edition = "2024"
license = "MIT"
repository = "https://github.com/KittyCAD/modeling-app"
-rust-version = "1.83"
+rust-version = "1.88"
authors = ["Jess Frazelle", "Adam Chalmers", "KittyCAD, Inc"]
keywords = ["kcl", "KittyCAD", "CAD"]
exclude = ["tests/*", "benches/*", "examples/*", "e2e/*", "bindings/*", "fuzz/*"]

@@ -74,7 +74,7 @@ sha2 = "0.10.9"
tabled = { version = "0.20.0", optional = true }
tempfile = "3.20"
thiserror = "2.0.0"
-toml = "0.8.22"
+toml = "0.8.23"
ts-rs = { version = "11.0.1", features = [
    "uuid-impl",
    "url-impl",


@@ -4,7 +4,7 @@ use std::{
    path::{Path, PathBuf},
};

-use criterion::{criterion_group, criterion_main, Criterion};
+use criterion::{Criterion, criterion_group, criterion_main};

const IGNORE_DIRS: [&str; 2] = ["step", "screenshots"];

@@ -61,7 +61,7 @@ fn run_benchmarks(c: &mut Criterion) {
    // Read the file content (panic on failure)
    let input_content = fs::read_to_string(&input_file)
-        .unwrap_or_else(|e| panic!("Failed to read main.kcl in directory {}: {}", dir_name, e));
+        .unwrap_or_else(|e| panic!("Failed to read main.kcl in directory {dir_name}: {e}"));

    // Create a benchmark group for this directory
    let mut group = c.benchmark_group(&dir_name);

@@ -72,12 +72,12 @@ fn run_benchmarks(c: &mut Criterion) {
    #[cfg(feature = "benchmark-execution")]
    let program = kcl_lib::Program::parse_no_errs(&input_content).unwrap();

-    group.bench_function(format!("parse_{}", dir_name), |b| {
+    group.bench_function(format!("parse_{dir_name}"), |b| {
        b.iter(|| kcl_lib::Program::parse_no_errs(black_box(&input_content)).unwrap())
    });

    #[cfg(feature = "benchmark-execution")]
-    group.bench_function(format!("execute_{}", dir_name), |b| {
+    group.bench_function(format!("execute_{dir_name}"), |b| {
        b.iter(|| {
            if let Err(err) = rt.block_on(async {
                let ctx = kcl_lib::ExecutorContext::new_with_default_client().await?;

@@ -86,7 +86,7 @@ fn run_benchmarks(c: &mut Criterion) {
                ctx.close().await;
                Ok::<(), anyhow::Error>(())
            }) {
-                panic!("Failed to execute program: {}", err);
+                panic!("Failed to execute program: {err}");
            }
        })
    });


@@ -1,6 +1,6 @@
use std::hint::black_box;

-use criterion::{criterion_group, criterion_main, Criterion};
+use criterion::{Criterion, criterion_group, criterion_main};

pub fn bench_parse(c: &mut Criterion) {
    for (name, file) in [


@@ -1,4 +1,4 @@
-use criterion::{criterion_group, criterion_main, Criterion};
+use criterion::{Criterion, criterion_group, criterion_main};

pub fn bench_digest(c: &mut Criterion) {
    for (name, file) in [


@@ -1,6 +1,6 @@
use std::hint::black_box;

-use criterion::{criterion_group, criterion_main, BenchmarkId, Criterion};
+use criterion::{BenchmarkId, Criterion, criterion_group, criterion_main};
use kcl_lib::kcl_lsp_server;
use tokio::runtime::Runtime;
use tower_lsp::LanguageServer;


@@ -1,9 +1,9 @@
 //! Cache testing framework.
-use kcl_lib::{bust_cache, ExecError, ExecOutcome};
+use kcl_lib::{ExecError, ExecOutcome, bust_cache};
 #[cfg(feature = "artifact-graph")]
-use kcl_lib::{exec::Operation, NodePathStep};
+use kcl_lib::{NodePathStep, exec::Operation};
-use kcmc::{each_cmd as mcmd, ModelingCmd};
+use kcmc::{ModelingCmd, each_cmd as mcmd};
 use kittycad_modeling_cmds as kcmc;
 use pretty_assertions::assert_eq;
@@ -38,7 +38,7 @@ async fn cache_test(
 if !variation.other_files.is_empty() {
 let tmp_dir = std::env::temp_dir();
 let tmp_dir = tmp_dir
-.join(format!("kcl_test_{}", test_name))
+.join(format!("kcl_test_{test_name}"))
 .join(uuid::Uuid::new_v4().to_string());
 // Create a temporary file for each of the other files.
@@ -56,7 +56,7 @@ async fn cache_test(
 Err(error) => {
 let report = error.clone().into_miette_report_with_outputs(variation.code).unwrap();
 let report = miette::Report::new(report);
-panic!("{:?}", report);
+panic!("{report:?}");
 }
 };
@@ -69,7 +69,7 @@ async fn cache_test(
 .and_then(|x| x.decode().map_err(|e| ExecError::BadPng(e.to_string())))
 .unwrap();
 // Save the snapshot.
-let path = crate::assert_out(&format!("cache_{}_{}", test_name, index), &img);
+let path = crate::assert_out(&format!("cache_{test_name}_{index}"), &img);
 img_results.push((path, img, outcome));
 }
@@ -337,8 +337,7 @@ extrude001 = extrude(profile001, length = 4)
 // 0] as a more lenient check.
 .map(|c| !c.range.is_synthetic() && c.node_path.is_empty())
 .unwrap_or(false),
-"artifact={:?}",
-artifact
+"artifact={artifact:?}"
 );
 }
 }


@@ -1,8 +1,8 @@
 mod cache;
 use kcl_lib::{
-test_server::{execute_and_export_step, execute_and_snapshot, execute_and_snapshot_no_auth},
 BacktraceItem, ExecError, ModuleId, SourceRange,
+test_server::{execute_and_export_step, execute_and_snapshot, execute_and_snapshot_no_auth},
 };
 /// The minimum permissible difference between asserted twenty-twenty images.
@@ -869,11 +869,13 @@ async fn kcl_test_revolve_bad_angle_low() {
 let result = execute_and_snapshot(code, None).await;
 assert!(result.is_err());
-assert!(result
-.err()
-.unwrap()
-.to_string()
-.contains("Expected angle to be between -360 and 360 and not 0, found `-455`"));
+assert!(
+result
+.err()
+.unwrap()
+.to_string()
+.contains("Expected angle to be between -360 and 360 and not 0, found `-455`")
+);
 }
 #[tokio::test(flavor = "multi_thread")]
@@ -895,11 +897,13 @@ async fn kcl_test_revolve_bad_angle_high() {
 let result = execute_and_snapshot(code, None).await;
 assert!(result.is_err());
-assert!(result
-.err()
-.unwrap()
-.to_string()
-.contains("Expected angle to be between -360 and 360 and not 0, found `455`"));
+assert!(
+result
+.err()
+.unwrap()
+.to_string()
+.contains("Expected angle to be between -360 and 360 and not 0, found `455`")
+);
 }
 #[tokio::test(flavor = "multi_thread")]
@@ -2090,7 +2094,10 @@ async fn kcl_test_better_type_names() {
 },
 None => todo!(),
 };
-assert_eq!(err, "This function expected the input argument to be one or more Solids or ImportedGeometry but it's actually of type Sketch. You can convert a sketch (2D) into a Solid (3D) by calling a function like `extrude` or `revolve`");
+assert_eq!(
+err,
+"This function expected the input argument to be one or more Solids or ImportedGeometry but it's actually of type Sketch. You can convert a sketch (2D) into a Solid (3D) by calling a function like `extrude` or `revolve`"
+);
 }
 #[tokio::test(flavor = "multi_thread")]
#[tokio::test(flavor = "multi_thread")] #[tokio::test(flavor = "multi_thread")]


@@ -101,7 +101,7 @@ pub trait CoreDump: Clone {
 .meta()
 .create_debug_uploads(vec![kittycad::types::multipart::Attachment {
 name: "".to_string(),
-filepath: Some(format!(r#"modeling-app/coredump-{}.json"#, coredump_id).into()),
+filepath: Some(format!(r#"modeling-app/coredump-{coredump_id}.json"#).into()),
 content_type: Some("application/json".to_string()),
 data,
 }])


@@ -189,7 +189,7 @@ fn generate_example(index: usize, src: &str, props: &ExampleProperties, file_nam
 index
 );
 let image_data =
-std::fs::read(&image_path).unwrap_or_else(|_| panic!("Failed to read image file: {}", image_path));
+std::fs::read(&image_path).unwrap_or_else(|_| panic!("Failed to read image file: {image_path}"));
 base64::engine::general_purpose::STANDARD.encode(&image_data)
 };
@@ -225,7 +225,7 @@ fn generate_type_from_kcl(ty: &TyData, file_name: String, example_name: String,
 let output = hbs.render("kclType", &data)?;
 let output = cleanup_types(&output, kcl_std);
-expectorate::assert_contents(format!("../../docs/kcl-std/{}.md", file_name), &output);
+expectorate::assert_contents(format!("../../docs/kcl-std/{file_name}.md"), &output);
 Ok(())
 }
@@ -267,7 +267,7 @@ fn generate_mod_from_kcl(m: &ModData, file_name: String) -> Result<()> {
 });
 let output = hbs.render("module", &data)?;
-expectorate::assert_contents(format!("../../docs/kcl-std/{}.md", file_name), &output);
+expectorate::assert_contents(format!("../../docs/kcl-std/{file_name}.md"), &output);
 Ok(())
 }
@@ -334,7 +334,7 @@ fn generate_function_from_kcl(
 let output = hbs.render("function", &data)?;
 let output = &cleanup_types(&output, kcl_std);
-expectorate::assert_contents(format!("../../docs/kcl-std/{}.md", file_name), output);
+expectorate::assert_contents(format!("../../docs/kcl-std/{file_name}.md"), output);
 Ok(())
 }
@@ -378,7 +378,7 @@ fn generate_const_from_kcl(cnst: &ConstData, file_name: String, example_name: St
 let output = hbs.render("const", &data)?;
 let output = cleanup_types(&output, kcl_std);
-expectorate::assert_contents(format!("../../docs/kcl-std/{}.md", file_name), &output);
+expectorate::assert_contents(format!("../../docs/kcl-std/{file_name}.md"), &output);
 Ok(())
 }


@@ -8,6 +8,7 @@ use tower_lsp::lsp_types::{
 };
 use crate::{
+ModuleId,
 execution::annotations,
 parsing::{
 ast::types::{
@@ -15,7 +16,6 @@ use crate::{
 },
 token::NumericSuffix,
 },
-ModuleId,
 };
 pub fn walk_prelude() -> ModData {
@@ -97,7 +97,7 @@ fn visit_module(name: &str, preferred_prefix: &str, names: WalkForNames) -> Resu
 ImportSelector::None { .. } => {
 let name = import.module_name().unwrap();
 if names.contains(&name) {
-Some(visit_module(&path[1], &format!("{}::", name), WalkForNames::All)?)
+Some(visit_module(&path[1], &format!("{name}::"), WalkForNames::All)?)
 } else {
 None
 }
@@ -451,7 +451,7 @@ impl ModData {
 let (name, qual_name, module_name) = if name == "prelude" {
 ("std", "std".to_owned(), String::new())
 } else {
-(name, format!("std::{}", name), "std".to_owned())
+(name, format!("std::{name}"), "std".to_owned())
 };
 Self {
 preferred_name: format!("{preferred_prefix}{name}"),
@@ -767,14 +767,12 @@ impl ArgData {
 for s in &arr.elements {
 let Expr::Literal(lit) = s else {
 panic!(
-"Invalid value in `snippetArray`, all items must be string literals but found {:?}",
-s
+"Invalid value in `snippetArray`, all items must be string literals but found {s:?}"
 );
 };
 let LiteralValue::String(litstr) = &lit.inner.value else {
 panic!(
-"Invalid value in `snippetArray`, all items must be string literals but found {:?}",
-s
+"Invalid value in `snippetArray`, all items must be string literals but found {s:?}"
 );
 };
 items.push(litstr.to_owned());
@@ -816,7 +814,7 @@ impl ArgData {
 }
 match self.ty.as_deref() {
 Some("Sketch") if self.kind == ArgKind::Special => None,
-Some(s) if s.starts_with("number") => Some((index, format!(r#"{label}${{{}:10}}"#, index))),
+Some(s) if s.starts_with("number") => Some((index, format!(r#"{label}${{{index}:10}}"#))),
 Some("Point2d") => Some((index + 1, format!(r#"{label}[${{{}:0}}, ${{{}:0}}]"#, index, index + 1))),
 Some("Point3d") => Some((
 index + 2,
@@ -831,7 +829,7 @@ impl ArgData {
 Some("Sketch") | Some("Sketch | Helix") => Some((index, format!(r#"{label}${{{index}:sketch000}}"#))),
 Some("Edge") => Some((index, format!(r#"{label}${{{index}:tag_or_edge_fn}}"#))),
 Some("[Edge; 1+]") => Some((index, format!(r#"{label}[${{{index}:tag_or_edge_fn}}]"#))),
-Some("Plane") | Some("Solid | Plane") => Some((index, format!(r#"{label}${{{}:XY}}"#, index))),
+Some("Plane") | Some("Solid | Plane") => Some((index, format!(r#"{label}${{{index}:XY}}"#))),
 Some("[TaggedFace; 2]") => Some((
 index + 1,
 format!(r#"{label}[${{{}:tag}}, ${{{}:tag}}]"#, index, index + 1),
@@ -841,10 +839,10 @@ impl ArgData {
 if self.name == "color" {
 Some((index, format!(r"{label}${{{}:{}}}", index, "\"#ff0000\"")))
 } else {
-Some((index, format!(r#"{label}${{{}:"string"}}"#, index)))
+Some((index, format!(r#"{label}${{{index}:"string"}}"#)))
 }
 }
-Some("bool") => Some((index, format!(r#"{label}${{{}:false}}"#, index))),
+Some("bool") => Some((index, format!(r#"{label}${{{index}:false}}"#))),
 _ => None,
 }
 }
@@ -1298,7 +1296,10 @@ mod test {
 continue;
 }
 let name = format!("{}-{i}", f.qual_name.replace("::", "-"));
-assert!(TEST_NAMES.contains(&&*name), "Missing test for example \"{name}\", maybe need to update kcl-derive-docs/src/example_tests.rs?")
+assert!(
+TEST_NAMES.contains(&&*name),
+"Missing test for example \"{name}\", maybe need to update kcl-derive-docs/src/example_tests.rs?"
+)
 }
 }
 }
@@ -1334,7 +1335,9 @@ mod test {
 };
 let Some(DocData::Fn(d)) = data.children.get(&format!("I:{qualname}")) else {
-panic!("Could not find data for {NAME} (missing a child entry for {qualname}), maybe need to update kcl-derive-docs/src/example_tests.rs?");
+panic!(
+"Could not find data for {NAME} (missing a child entry for {qualname}), maybe need to update kcl-derive-docs/src/example_tests.rs?"
+);
 };
 for (i, eg) in d.examples.iter().enumerate() {
@@ -1362,6 +1365,8 @@ mod test {
 return;
 }
-panic!("Could not find data for {NAME} (no example {number}), maybe need to update kcl-derive-docs/src/example_tests.rs?");
+panic!(
+"Could not find data for {NAME} (no example {number}), maybe need to update kcl-derive-docs/src/example_tests.rs?"
+);
 }
 }


@@ -2,11 +2,11 @@
 //! tasks.
 use std::sync::{
-atomic::{AtomicUsize, Ordering},
 Arc,
+atomic::{AtomicUsize, Ordering},
 };
-use tokio::sync::{mpsc, Notify};
+use tokio::sync::{Notify, mpsc};
 use crate::errors::KclError;
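The `use` reorderings in this and the surrounding files (`{mpsc, Notify}` becoming `{Notify, mpsc}`, plain names moving ahead of nested paths) match rustfmt's 2024 style edition, whose version-sorting places uppercase type-like names before lowercase module-like ones inside braces. A hedged guess at the opt-in, since the repository's actual rustfmt configuration is not shown in this diff:

```toml
# Hypothetical rustfmt.toml for illustration; the project's real settings may differ.
style_edition = "2024"
```

With a setting like this, `cargo fmt` rewrites the import lists automatically, so this kind of diff noise is a one-time migration cost.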


@@ -3,26 +3,26 @@
 use std::{collections::HashMap, sync::Arc};
-use anyhow::{anyhow, Result};
+use anyhow::{Result, anyhow};
 use futures::{SinkExt, StreamExt};
 use indexmap::IndexMap;
 use kcmc::{
+ModelingCmd,
 websocket::{
 BatchResponse, FailureWebSocketResponse, ModelingCmdReq, ModelingSessionData, OkWebSocketResponseData,
 SuccessWebSocketResponse, WebSocketRequest, WebSocketResponse,
 },
-ModelingCmd,
 };
 use kittycad_modeling_cmds::{self as kcmc};
-use tokio::sync::{mpsc, oneshot, RwLock};
+use tokio::sync::{RwLock, mpsc, oneshot};
 use tokio_tungstenite::tungstenite::Message as WsMsg;
 use uuid::Uuid;
 use crate::{
+SourceRange,
 engine::{AsyncTasks, EngineManager, EngineStats},
 errors::{KclError, KclErrorDetails},
 execution::{DefaultPlanes, IdGenerator},
-SourceRange,
 };
 #[derive(Debug, PartialEq)]
@@ -85,7 +85,7 @@ impl TcpRead {
 let msg = match msg {
 Ok(msg) => msg,
 Err(e) if matches!(e, tokio_tungstenite::tungstenite::Error::Protocol(_)) => {
-return Err(WebSocketReadError::Read(e))
+return Err(WebSocketReadError::Read(e));
 }
 Err(e) => return Err(anyhow::anyhow!("Error reading from engine's WebSocket: {e}").into()),
 };
@@ -427,7 +427,7 @@ impl EngineManager for EngineConnection {
 request_sent: tx,
 })
 .await
-.map_err(|e| KclError::new_engine(KclErrorDetails::new(format!("Failed to send debug: {}", e), vec![])))?;
+.map_err(|e| KclError::new_engine(KclErrorDetails::new(format!("Failed to send debug: {e}"), vec![])))?;
 let _ = rx.await;
 Ok(())
@@ -463,7 +463,7 @@ impl EngineManager for EngineConnection {
 .await
 .map_err(|e| {
 KclError::new_engine(KclErrorDetails::new(
-format!("Failed to send modeling command: {}", e),
+format!("Failed to send modeling command: {e}"),
 vec![source_range],
 ))
 })?;
@@ -533,7 +533,7 @@ impl EngineManager for EngineConnection {
 }
 Err(KclError::new_engine(KclErrorDetails::new(
-format!("Modeling command timed out `{}`", id),
+format!("Modeling command timed out `{id}`"),
 vec![source_range],
 )))
 }


@@ -12,16 +12,16 @@ use kcmc::{
 WebSocketResponse,
 },
 };
-use kittycad_modeling_cmds::{self as kcmc, websocket::ModelingCmdReq, ImportFiles, ModelingCmd};
+use kittycad_modeling_cmds::{self as kcmc, ImportFiles, ModelingCmd, websocket::ModelingCmdReq};
 use tokio::sync::RwLock;
 use uuid::Uuid;
 use crate::{
+SourceRange,
 engine::{AsyncTasks, EngineStats},
 errors::KclError,
 exec::DefaultPlanes,
 execution::IdGenerator,
-SourceRange,
 };
 #[derive(Debug, Clone)]


@@ -11,10 +11,10 @@ use uuid::Uuid;
 use wasm_bindgen::prelude::*;
 use crate::{
+SourceRange,
 engine::{AsyncTasks, EngineStats},
 errors::{KclError, KclErrorDetails},
 execution::{DefaultPlanes, IdGenerator},
-SourceRange,
 };
 #[wasm_bindgen(module = "/../../src/lang/std/engineConnection.ts")]


@@ -12,15 +12,15 @@ pub mod conn_wasm;
 use std::{
 collections::HashMap,
 sync::{
-atomic::{AtomicUsize, Ordering},
 Arc,
+atomic::{AtomicUsize, Ordering},
 },
 };
 pub use async_tasks::AsyncTasks;
 use indexmap::IndexMap;
 use kcmc::{
-each_cmd as mcmd,
+ModelingCmd, each_cmd as mcmd,
 length_unit::LengthUnit,
 ok_response::OkModelingCmdResponse,
 shared::Color,
@@ -28,7 +28,6 @@ use kcmc::{
 BatchResponse, ModelingBatch, ModelingCmdReq, ModelingSessionData, OkWebSocketResponseData, WebSocketRequest,
 WebSocketResponse,
 },
-ModelingCmd,
 };
 use kittycad_modeling_cmds as kcmc;
 use parse_display::{Display, FromStr};
@@ -39,9 +38,9 @@ use uuid::Uuid;
 use web_time::Instant;
 use crate::{
-errors::{KclError, KclErrorDetails},
-execution::{types::UnitLen, DefaultPlanes, IdGenerator, PlaneInfo, Point3d},
 SourceRange,
+errors::{KclError, KclErrorDetails},
+execution::{DefaultPlanes, IdGenerator, PlaneInfo, Point3d, types::UnitLen},
 };
 lazy_static::lazy_static! {
@@ -291,7 +290,10 @@ pub trait EngineManager: std::fmt::Debug + Send + Sync + 'static {
 // the artifact graph won't care either if its gone since you can't select it
 // anymore anyways.
 if let Err(err) = self.async_tasks().join_all().await {
-crate::log::logln!("Error waiting for async tasks (this is typically fine and just means that an edge became something else): {:?}", err);
+crate::log::logln!(
+"Error waiting for async tasks (this is typically fine and just means that an edge became something else): {:?}",
+err
+);
 }
 // Flush the batch to make sure nothing remains.
@@ -499,7 +501,7 @@ pub trait EngineManager: std::fmt::Debug + Send + Sync + 'static {
 }
 _ => {
 return Err(KclError::new_engine(KclErrorDetails::new(
-format!("The request is not a modeling command: {:?}", req),
+format!("The request is not a modeling command: {req:?}"),
 vec![*range],
 )));
 }
@@ -529,7 +531,7 @@ pub trait EngineManager: std::fmt::Debug + Send + Sync + 'static {
 } else {
 // We should never get here.
 Err(KclError::new_engine(KclErrorDetails::new(
-format!("Failed to get batch response: {:?}", response),
+format!("Failed to get batch response: {response:?}"),
 vec![source_range],
 )))
 }
@@ -544,7 +546,7 @@ pub trait EngineManager: std::fmt::Debug + Send + Sync + 'static {
 // an error.
 let source_range = id_to_source_range.get(cmd_id.as_ref()).cloned().ok_or_else(|| {
 KclError::new_engine(KclErrorDetails::new(
-format!("Failed to get source range for command ID: {:?}", cmd_id),
+format!("Failed to get source range for command ID: {cmd_id:?}"),
 vec![],
 ))
 })?;
@@ -554,7 +556,7 @@ pub trait EngineManager: std::fmt::Debug + Send + Sync + 'static {
 self.parse_websocket_response(ws_resp, source_range)
 }
 _ => Err(KclError::new_engine(KclErrorDetails::new(
-format!("The final request is not a modeling command: {:?}", final_req),
+format!("The final request is not a modeling command: {final_req:?}"),
 vec![source_range],
 ))),
 }
@@ -663,7 +665,7 @@ pub trait EngineManager: std::fmt::Debug + Send + Sync + 'static {
 let info = DEFAULT_PLANE_INFO.get(&name).ok_or_else(|| {
 // We should never get here.
 KclError::new_engine(KclErrorDetails::new(
-format!("Failed to get default plane info for: {:?}", name),
+format!("Failed to get default plane info for: {name:?}"),
 vec![source_range],
 ))
 })?;
@@ -739,7 +741,7 @@ pub trait EngineManager: std::fmt::Debug + Send + Sync + 'static {
 // Get the source range for the command.
 let source_range = id_to_source_range.get(cmd_id).cloned().ok_or_else(|| {
 KclError::new_engine(KclErrorDetails::new(
-format!("Failed to get source range for command ID: {:?}", cmd_id),
+format!("Failed to get source range for command ID: {cmd_id:?}"),
 vec![],
 ))
 })?;
@@ -754,7 +756,7 @@ pub trait EngineManager: std::fmt::Debug + Send + Sync + 'static {
 // Return an error that we did not get an error or the response we wanted.
 // This should never happen but who knows.
 Err(KclError::new_engine(KclErrorDetails::new(
-format!("Failed to find response for command ID: {:?}", id),
+format!("Failed to find response for command ID: {id:?}"),
 vec![],
 )))
 }


@@ -7,11 +7,11 @@ use tower_lsp::lsp_types::{Diagnostic, DiagnosticSeverity};
 #[cfg(feature = "artifact-graph")]
 use crate::execution::{ArtifactCommand, ArtifactGraph, Operation};
 use crate::{
+ModuleId,
 execution::DefaultPlanes,
 lsp::IntoDiagnostic,
 modules::{ModulePath, ModuleSource},
 source_range::SourceRange,
-ModuleId,
 };
 /// How did the KCL execution fail


@@ -2,13 +2,13 @@
 use std::str::FromStr;
-use kittycad_modeling_cmds::coord::{System, KITTYCAD, OPENGL, VULKAN};
+use kittycad_modeling_cmds::coord::{KITTYCAD, OPENGL, System, VULKAN};
 use crate::{
+KclError, SourceRange,
 errors::KclErrorDetails,
 execution::types::{UnitAngle, UnitLen},
 parsing::ast::types::{Annotation, Expr, LiteralValue, Node, ObjectProperty},
-KclError, SourceRange,
 };
 /// Annotations which should cause re-execution if they change.


@@ -1,20 +1,19 @@
 use fnv::FnvHashMap;
 use indexmap::IndexMap;
 use kittycad_modeling_cmds::{
-self as kcmc,
+self as kcmc, EnableSketchMode, ModelingCmd,
 ok_response::OkModelingCmdResponse,
 shared::ExtrusionFaceCapType,
 websocket::{BatchResponse, OkWebSocketResponseData, WebSocketResponse},
-EnableSketchMode, ModelingCmd,
 };
-use serde::{ser::SerializeSeq, Serialize};
+use serde::{Serialize, ser::SerializeSeq};
 use uuid::Uuid;
 use crate::{
+KclError, NodePath, SourceRange,
 errors::KclErrorDetails,
 execution::ArtifactId,
 parsing::ast::types::{Node, Program},
-KclError, NodePath, SourceRange,
 };
 #[cfg(test)]
@@ -893,7 +892,10 @@ fn artifacts_to_update(
 ),
 };
 if original_path_ids.len() != face_edge_infos.len() {
-internal_error!(range, "EntityMirror or EntityMirrorAcrossEdge response has different number face edge info than original mirrored paths: id={id:?}, cmd={cmd:?}, response={response:?}");
+internal_error!(
+range,
+"EntityMirror or EntityMirrorAcrossEdge response has different number face edge info than original mirrored paths: id={id:?}, cmd={cmd:?}, response={response:?}"
+);
 }
 let mut return_arr = Vec::new();
 for (face_edge_info, original_path_id) in face_edge_infos.iter().zip(original_path_ids) {
@@ -909,7 +911,10 @@ fn artifacts_to_update(
 // of its info.
 let Some(Artifact::Path(original_path)) = artifacts.get(&original_path_id) else {
 // We couldn't find the original path. This is a bug.
-internal_error!(range, "Couldn't find original path for mirror2d: original_path_id={original_path_id:?}, cmd={cmd:?}");
+internal_error!(
+range,
+"Couldn't find original path for mirror2d: original_path_id={original_path_id:?}, cmd={cmd:?}"
+);
 };
 Path {
 id: path_id,


@@ -268,7 +268,7 @@ impl ArtifactGraph {
 for (group_id, artifact_ids) in groups {
 let group_id = *stable_id_map.get(&group_id).unwrap();
 writeln!(output, "{prefix}subgraph path{group_id} [Path]")?;
-let indented = format!("{} ", prefix);
+let indented = format!("{prefix} ");
 for artifact_id in artifact_ids {
 let artifact = self.map.get(&artifact_id).unwrap();
 let id = *stable_id_map.get(&artifact_id).unwrap();
@@ -353,7 +353,7 @@ impl ArtifactGraph {
 node_path_display(output, prefix, None, &segment.code_ref)?;
 }
 Artifact::Solid2d(_solid2d) => {
-writeln!(output, "{prefix}{}[Solid2d]", id)?;
+writeln!(output, "{prefix}{id}[Solid2d]")?;
 }
 Artifact::StartSketchOnFace(StartSketchOnFace { code_ref, .. }) => {
 writeln!(
@@ -494,24 +494,24 @@ impl ArtifactGraph {
 match edge.flow {
 EdgeFlow::SourceToTarget => match edge.direction {
 EdgeDirection::Forward => {
-writeln!(output, "{prefix}{source_id} x{}--> {}", extra, target_id)?;
+writeln!(output, "{prefix}{source_id} x{extra}--> {target_id}")?;
 }
 EdgeDirection::Backward => {
-writeln!(output, "{prefix}{source_id} <{}--x {}", extra, target_id)?;
+writeln!(output, "{prefix}{source_id} <{extra}--x {target_id}")?;
 }
 EdgeDirection::Bidirectional => {
-writeln!(output, "{prefix}{source_id} {}--- {}", extra, target_id)?;
+writeln!(output, "{prefix}{source_id} {extra}--- {target_id}")?;
 }
 },
 EdgeFlow::TargetToSource => match edge.direction {
 EdgeDirection::Forward => {
-writeln!(output, "{prefix}{target_id} x{}--> {}", extra, source_id)?;
+writeln!(output, "{prefix}{target_id} x{extra}--> {source_id}")?;
 }
 EdgeDirection::Backward => {
-writeln!(output, "{prefix}{target_id} <{}--x {}", extra, source_id)?;
+writeln!(output, "{prefix}{target_id} <{extra}--x {source_id}")?;
 }
 EdgeDirection::Bidirectional => {
-writeln!(output, "{prefix}{target_id} {}--- {}", extra, source_id)?;
+writeln!(output, "{prefix}{target_id} {extra}--- {source_id}")?;
 }
 },
 }


@@ -6,15 +6,14 @@ use itertools::{EitherOrBoth, Itertools};
 use tokio::sync::RwLock;
 use crate::{
+ExecOutcome, ExecutorContext,
 execution::{
-annotations, EnvironmentRef, ExecutorSettings,
+annotations,
 memory::Stack,
 state::{self as exec_state, ModuleInfoMap},
+EnvironmentRef, ExecutorSettings,
 },
 parsing::ast::types::{Annotation, Node, Program},
 walk::Node as WalkNode,
-ExecOutcome, ExecutorContext,
 };
 lazy_static::lazy_static! {
@@ -337,7 +336,7 @@ mod tests {
 use pretty_assertions::assert_eq;
 use super::*;
-use crate::execution::{parse_execute, parse_execute_with_project_dir, ExecTestResults};
+use crate::execution::{ExecTestResults, parse_execute, parse_execute_with_project_dir};
 #[tokio::test(flavor = "multi_thread")]
 async fn test_get_changed_program_same_code() {
@@ -755,7 +754,7 @@ extrude(profile001, length = 100)"#
 .await;
 let CacheResult::CheckImportsOnly { reapply_settings, .. } = result else {
-panic!("Expected CheckImportsOnly, got {:?}", result);
+panic!("Expected CheckImportsOnly, got {result:?}");
 };
 assert_eq!(reapply_settings, false);
@@ -839,7 +838,7 @@ extrude(profile001, length = 100)
 .await;
 let CacheResult::CheckImportsOnly { reapply_settings, .. } = result else {
-panic!("Expected CheckImportsOnly, got {:?}", result);
+panic!("Expected CheckImportsOnly, got {result:?}");
 };
 assert_eq!(reapply_settings, false);


@@ -1,10 +1,10 @@
 use indexmap::IndexMap;
 use serde::Serialize;
-use super::{types::NumericType, ArtifactId, KclValue};
+use super::{ArtifactId, KclValue, types::NumericType};
 #[cfg(feature = "artifact-graph")]
 use crate::parsing::ast::types::{Node, Program};
-use crate::{parsing::ast::types::ItemVisibility, ModuleId, NodePath, SourceRange};
+use crate::{ModuleId, NodePath, SourceRange, parsing::ast::types::ItemVisibility};
 /// A CAD modeling operation for display in the feature tree, AKA operations
 /// timeline.
@@ -57,7 +57,7 @@ impl Operation {
     /// If the variant is `StdLibCall`, set the `is_error` field.
     pub(crate) fn set_std_lib_call_is_error(&mut self, is_err: bool) {
         match self {
-            Self::StdLibCall { ref mut is_error, .. } => *is_error = is_err,
+            Self::StdLibCall { is_error, .. } => *is_error = is_err,
             Self::VariableDeclaration { .. } | Self::GroupBegin { .. } | Self::GroupEnd => {}
         }
     }
@@ -226,10 +226,7 @@ impl From<&KclValue> for OpKclValue {
         match value {
             KclValue::Uuid { value, .. } => Self::Uuid { value: *value },
             KclValue::Bool { value, .. } => Self::Bool { value: *value },
-            KclValue::Number { value, ty, .. } => Self::Number {
-                value: *value,
-                ty: ty.clone(),
-            },
+            KclValue::Number { value, ty, .. } => Self::Number { value: *value, ty: *ty },
             KclValue::String { value, .. } => Self::String { value: value.clone() },
             KclValue::Tuple { value, .. } | KclValue::HomArray { value, .. } => {

View File

@@ -3,17 +3,17 @@ use std::collections::HashMap;
 use async_recursion::async_recursion;
 use crate::{
+    CompilationError, NodePath,
     errors::{KclError, KclErrorDetails},
     execution::{
-        annotations,
+        BodyType, EnvironmentRef, ExecState, ExecutorContext, KclValue, Metadata, ModelingCmdMeta, ModuleArtifactState,
+        Operation, PlaneType, StatementKind, TagIdentifier, annotations,
         cad_op::OpKclValue,
         fn_call::Args,
         kcl_value::{FunctionSource, TypeDef},
         memory,
         state::ModuleState,
         types::{NumericType, PrimitiveType, RuntimeType},
-        BodyType, EnvironmentRef, ExecState, ExecutorContext, KclValue, Metadata, ModelingCmdMeta, ModuleArtifactState,
-        Operation, PlaneType, StatementKind, TagIdentifier,
     },
     fmt,
     modules::{ModuleId, ModulePath, ModuleRepr},
@@ -28,7 +28,6 @@ use crate::{
     },
     source_range::SourceRange,
     std::args::TyF64,
-    CompilationError, NodePath,
 };
 impl<'a> StatementKind<'a> {
@@ -198,19 +197,23 @@ impl ExecutorContext {
         }
         if ty.is_ok() && !module_exports.contains(&ty_name) {
-            ty = Err(KclError::new_semantic(KclErrorDetails::new(format!(
-                "Cannot import \"{}\" from module because it is not exported. Add \"export\" before the definition to export it.",
-                import_item.name.name
-            ),
-            vec![SourceRange::from(&import_item.name)],)));
+            ty = Err(KclError::new_semantic(KclErrorDetails::new(
+                format!(
+                    "Cannot import \"{}\" from module because it is not exported. Add \"export\" before the definition to export it.",
+                    import_item.name.name
+                ),
+                vec![SourceRange::from(&import_item.name)],
+            )));
         }
         if mod_value.is_ok() && !module_exports.contains(&mod_name) {
-            mod_value = Err(KclError::new_semantic(KclErrorDetails::new(format!(
-                "Cannot import \"{}\" from module because it is not exported. Add \"export\" before the definition to export it.",
-                import_item.name.name
-            ),
-            vec![SourceRange::from(&import_item.name)],)));
+            mod_value = Err(KclError::new_semantic(KclErrorDetails::new(
+                format!(
+                    "Cannot import \"{}\" from module because it is not exported. Add \"export\" before the definition to export it.",
+                    import_item.name.name
+                ),
+                vec![SourceRange::from(&import_item.name)],
+            )));
         }
         if value.is_err() && ty.is_err() && mod_value.is_err() {
@@ -270,7 +273,7 @@ impl ExecutorContext {
             .get_from(name, env_ref, source_range, 0)
             .map_err(|_err| {
                 KclError::new_internal(KclErrorDetails::new(
-                    format!("{} is not defined in module (but was exported?)", name),
+                    format!("{name} is not defined in module (but was exported?)"),
                     vec![source_range],
                 ))
             })?
@@ -431,7 +434,7 @@ impl ExecutorContext {
                 return Err(KclError::new_semantic(KclErrorDetails::new(
                     "User-defined types are not yet supported.".to_owned(),
                     vec![metadata.source_range],
-                )))
+                )));
             }
         },
     }
@@ -792,11 +795,12 @@ fn var_in_own_ref_err(e: KclError, being_declared: &Option<String>) -> KclError
     // TODO after June 26th: replace this with a let-chain,
     // which will be available in Rust 1.88
     // https://rust-lang.github.io/rfcs/2497-if-let-chains.html
-    match (&being_declared, &name) {
-        (Some(name0), Some(name1)) if name0 == name1 => {
-            details.message = format!("You can't use `{name0}` because you're currently trying to define it. Use a different variable here instead.");
-        }
-        _ => {}
+    if let (Some(name0), Some(name1)) = (&being_declared, &name)
+        && name0 == name1
+    {
+        details.message = format!(
+            "You can't use `{name0}` because you're currently trying to define it. Use a different variable here instead."
+        );
     }
     KclError::UndefinedValue { details, name }
 }
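The hunk above replaces a single-arm `match` with a let-chain (`if let … && cond`, stabilized in Rust 1.88, as the TODO comment anticipates). On earlier toolchains the same control flow is spelled as an `if let` with a nested condition; a self-contained sketch with made-up names:

```rust
// Pre-1.88 spelling of "both are Some AND they are equal". With let-chains
// this collapses to: `if let (Some(name0), Some(name1)) = (...) && name0 == name1`.
fn same_name(being_declared: &Option<String>, name: &Option<String>) -> bool {
    if let (Some(name0), Some(name1)) = (being_declared, name) {
        if name0 == name1 {
            return true;
        }
    }
    false
}

fn main() {
    let a = Some("x".to_string());
    assert!(same_name(&a, &a.clone()));
    assert!(!same_name(&a, &None));
    assert!(!same_name(&None, &None)); // both None: no names to compare
}
```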
@@ -1042,6 +1046,16 @@ impl Node<MemberExpression> {
             (KclValue::Solid { value }, Property::String(prop), false) if prop == "sketch" => Ok(KclValue::Sketch {
                 value: Box::new(value.sketch),
             }),
+            (geometry @ KclValue::Solid { .. }, Property::String(prop), false) if prop == "tags" => {
+                // This is a common mistake.
+                Err(KclError::new_semantic(KclErrorDetails::new(
+                    format!(
+                        "Property `{prop}` not found on {}. You can get a solid's tags through its sketch, as in, `exampleSolid.sketch.tags`.",
+                        geometry.human_friendly_type()
+                    ),
+                    vec![self.clone().into()],
+                )))
+            }
             (KclValue::Sketch { value: sk }, Property::String(prop), false) if prop == "tags" => Ok(KclValue::Object {
                 meta: vec![Metadata {
                     source_range: SourceRange::from(self.clone()),
@@ -1052,6 +1066,12 @@ impl Node<MemberExpression> {
                     .map(|(k, tag)| (k.to_owned(), KclValue::TagIdentifier(Box::new(tag.to_owned()))))
                     .collect(),
             }),
+            (geometry @ (KclValue::Sketch { .. } | KclValue::Solid { .. }), Property::String(property), false) => {
+                Err(KclError::new_semantic(KclErrorDetails::new(
+                    format!("Property `{property}` not found on {}", geometry.human_friendly_type()),
+                    vec![self.clone().into()],
+                )))
+            }
             (being_indexed, _, _) => Err(KclError::new_semantic(KclErrorDetails::new(
                 format!(
                     "Only arrays can be indexed, but you're trying to index {}",
@@ -1077,7 +1097,7 @@ impl Node<BinaryExpression> {
             (&left_value, &right_value)
         {
             return Ok(KclValue::String {
-                value: format!("{}{}", left, right),
+                value: format!("{left}{right}"),
                 meta,
             });
         }
@@ -1237,7 +1257,9 @@ impl Node<BinaryExpression> {
             exec_state.clear_units_warnings(&sr);
             let mut err = CompilationError::err(
                 sr,
-                format!("{} numbers which have unknown or incompatible units.\nYou can probably fix this error by specifying the units using type ascription, e.g., `len: number(mm)` or `(a * b): number(deg)`.", verb),
+                format!(
+                    "{verb} numbers which have unknown or incompatible units.\nYou can probably fix this error by specifying the units using type ascription, e.g., `len: number(mm)` or `(a * b): number(deg)`."
+                ),
             );
             err.tag = crate::errors::Tag::UnknownNumericUnits;
             exec_state.warn(err);
@@ -1291,7 +1313,7 @@ impl Node<UnaryExpression> {
             Ok(KclValue::Number {
                 value: -value,
                 meta,
-                ty: ty.clone(),
+                ty: *ty,
             })
         }
         KclValue::Plane { value } => {
@@ -1323,7 +1345,7 @@ impl Node<UnaryExpression> {
                 .map(|v| match v {
                     KclValue::Number { value, ty, meta } => Ok(KclValue::Number {
                         value: *value * -1.0,
-                        ty: ty.clone(),
+                        ty: *ty,
                         meta: meta.clone(),
                     }),
                     _ => Err(err()),
@@ -1344,7 +1366,7 @@ impl Node<UnaryExpression> {
                 .map(|v| match v {
                     KclValue::Number { value, ty, meta } => Ok(KclValue::Number {
                         value: *value * -1.0,
-                        ty: ty.clone(),
+                        ty: *ty,
                         meta: meta.clone(),
                     }),
                     _ => Err(err()),
@@ -1417,7 +1439,7 @@ async fn inner_execute_pipe_body(
     for expression in body {
         if let Expr::TagDeclarator(_) = expression {
             return Err(KclError::new_semantic(KclErrorDetails::new(
-                format!("This cannot be in a PipeExpression: {:?}", expression),
+                format!("This cannot be in a PipeExpression: {expression:?}"),
                 vec![expression.into()],
             )));
         }
@@ -1538,7 +1560,7 @@ impl Node<ArrayRangeExpression> {
                 .into_iter()
                 .map(|num| KclValue::Number {
                     value: num as f64,
-                    ty: start_ty.clone(),
+                    ty: start_ty,
                     meta: meta.clone(),
                 })
                 .collect(),
@@ -1699,9 +1721,15 @@ fn jvalue_to_prop(value: &KclValue, property_sr: Vec<SourceRange>, name: &str) -
     let make_err =
         |message: String| Err::<Property, _>(KclError::new_semantic(KclErrorDetails::new(message, property_sr)));
     match value {
-        n @ KclValue::Number{value: num, ty, .. } => {
-            if !matches!(ty, NumericType::Known(crate::exec::UnitType::Count) | NumericType::Default { .. } | NumericType::Any ) {
-                return make_err(format!("arrays can only be indexed by non-dimensioned numbers, found {}", n.human_friendly_type()));
+        n @ KclValue::Number { value: num, ty, .. } => {
+            if !matches!(
+                ty,
+                NumericType::Known(crate::exec::UnitType::Count) | NumericType::Default { .. } | NumericType::Any
+            ) {
+                return make_err(format!(
+                    "arrays can only be indexed by non-dimensioned numbers, found {}",
+                    n.human_friendly_type()
+                ));
             }
             let num = *num;
             if num < 0.0 {
@@ -1711,13 +1739,15 @@ fn jvalue_to_prop(value: &KclValue, property_sr: Vec<SourceRange>, name: &str) -
             if let Some(nearest_int) = nearest_int {
                 Ok(Property::UInt(nearest_int))
             } else {
-                make_err(format!("'{num}' is not an integer, so you can't index an array with it"))
+                make_err(format!(
+                    "'{num}' is not an integer, so you can't index an array with it"
+                ))
             }
         }
-        KclValue::String{value: x, meta:_} => Ok(Property::String(x.to_owned())),
-        _ => {
-            make_err(format!("{name} is not a valid property/index, you can only use a string to get the property of an object, or an int (>= 0) to get an item in an array"))
-        }
+        KclValue::String { value: x, meta: _ } => Ok(Property::String(x.to_owned())),
+        _ => make_err(format!(
+            "{name} is not a valid property/index, you can only use a string to get the property of an object, or an int (>= 0) to get an item in an array"
+        )),
     }
 }
@@ -1745,9 +1775,9 @@ mod test {
     use super::*;
     use crate::{
-        exec::UnitType,
-        execution::{parse_execute, ContextType},
         ExecutorSettings, UnitLen,
+        exec::UnitType,
+        execution::{ContextType, parse_execute},
     };
     #[tokio::test(flavor = "multi_thread")]
@@ -1777,7 +1807,7 @@ arr1 = [42]: [number(cm)]
         .get_from("arr1", result.mem_env, SourceRange::default(), 0)
         .unwrap();
     if let KclValue::HomArray { value, ty } = arr1 {
-        assert_eq!(value.len(), 1, "Expected Vec with specific length: found {:?}", value);
+        assert_eq!(value.len(), 1, "Expected Vec with specific length: found {value:?}");
         assert_eq!(*ty, RuntimeType::known_length(UnitLen::Cm));
         // Compare, ignoring meta.
         if let KclValue::Number { value, ty, .. } = &value[0] {
@@ -1946,7 +1976,7 @@ d = b + c
         .await
         .map_err(|err| {
             KclError::new_internal(KclErrorDetails::new(
-                format!("Failed to create mock engine connection: {}", err),
+                format!("Failed to create mock engine connection: {err}"),
                 vec![SourceRange::default()],
             ))
         })

View File

@@ -2,19 +2,19 @@ use async_recursion::async_recursion;
 use indexmap::IndexMap;
 use crate::{
+    CompilationError, NodePath,
     errors::{KclError, KclErrorDetails},
     execution::{
+        BodyType, EnvironmentRef, ExecState, ExecutorContext, KclValue, Metadata, StatementKind, TagEngineInfo,
+        TagIdentifier,
         cad_op::{Group, OpArg, OpKclValue, Operation},
         kcl_value::FunctionSource,
         memory,
         types::RuntimeType,
-        BodyType, EnvironmentRef, ExecState, ExecutorContext, KclValue, Metadata, StatementKind, TagEngineInfo,
-        TagIdentifier,
     },
     parsing::ast::types::{CallExpressionKw, DefaultParamVal, FunctionExpression, Node, Program, Type},
     source_range::SourceRange,
     std::StdFn,
-    CompilationError, NodePath,
 };
 #[derive(Debug, Clone)]
@@ -269,7 +269,7 @@ impl Node<CallExpressionKw> {
         };
         KclError::new_undefined_value(
             KclErrorDetails::new(
-                format!("Result of user-defined function {} is undefined", fn_name),
+                format!("Result of user-defined function {fn_name} is undefined"),
                 source_ranges,
             ),
             None,
@@ -401,7 +401,7 @@ impl FunctionDefinition<'_> {
 impl FunctionBody<'_> {
     fn prep_mem(&self, exec_state: &mut ExecState) {
         match self {
-            FunctionBody::Rust(_) => exec_state.mut_stack().push_new_env_for_rust_call(),
+            FunctionBody::Rust(_) => exec_state.mut_stack().push_new_root_env(true),
             FunctionBody::Kcl(_, memory) => exec_state.mut_stack().push_new_env_for_call(*memory),
         }
     }
@@ -445,7 +445,7 @@ fn update_memory_for_tags_of_geometry(result: &mut KclValue, exec_state: &mut Ex
                 }
             }
         }
-        KclValue::Solid { ref mut value } => {
+        KclValue::Solid { value } => {
            for v in &value.value {
                if let Some(tag) = v.get_tag() {
                    // Get the past tag and update it.
@@ -555,9 +555,9 @@ fn type_err_str(expected: &Type, found: &KclValue, source_range: &SourceRange, e
     let found_human = found.human_friendly_type();
     let found_ty = found.principal_type_string();
     let found_str = if found_human == found_ty || found_human == format!("a {}", strip_backticks(&found_ty)) {
-        format!("a value with type {}", found_ty)
+        format!("a value with type {found_ty}")
     } else {
-        format!("{found_human} (with type {})", found_ty)
+        format!("{found_human} (with type {found_ty})")
     };
     let mut result = format!("{expected_str}, but found {found_str}.");
@@ -626,7 +626,7 @@ fn type_check_params_kw(
                 format!(
                     "`{label}` is not an argument of {}",
                     fn_name
-                        .map(|n| format!("`{}`", n))
+                        .map(|n| format!("`{n}`"))
                         .unwrap_or_else(|| "this function".to_owned()),
                 ),
             ));
@@ -676,7 +676,7 @@ fn type_check_params_kw(
                 format!(
                     "The input argument of {} requires {}",
                     fn_name
-                        .map(|n| format!("`{}`", n))
+                        .map(|n| format!("`{n}`"))
                         .unwrap_or_else(|| "this function".to_owned()),
                     type_err_str(ty, &arg.1.value, &arg.1.source_range, exec_state),
                 ),
@@ -691,7 +691,7 @@ fn type_check_params_kw(
                 format!(
                     "{} expects an unlabeled first argument (`@{name}`), but it is labelled in the call",
                     fn_name
-                        .map(|n| format!("The function `{}`", n))
+                        .map(|n| format!("The function `{n}`"))
                         .unwrap_or_else(|| "This function".to_owned()),
                 ),
             ));
@@ -721,7 +721,7 @@ fn assign_args_to_params_kw(
                 )?;
             }
             None => match default {
-                Some(ref default_val) => {
+                Some(default_val) => {
                     let value = KclValue::from_default_param(default_val.clone(), exec_state);
                     exec_state
                         .mut_stack()
@@ -729,10 +729,7 @@ fn assign_args_to_params_kw(
                 }
                 None => {
                     return Err(KclError::new_semantic(KclErrorDetails::new(
-                        format!(
-                            "This function requires a parameter {}, but you haven't passed it one.",
-                            name
-                        ),
+                        format!("This function requires a parameter {name}, but you haven't passed it one."),
                         source_ranges,
                     )));
                 }
@@ -746,7 +743,9 @@ fn assign_args_to_params_kw(
     let Some(unlabeled) = unlabelled else {
         return Err(if args.kw_args.labeled.contains_key(param_name) {
             KclError::new_semantic(KclErrorDetails::new(
-                format!("The function does declare a parameter named '{param_name}', but this parameter doesn't use a label. Try removing the `{param_name}:`"),
+                format!(
+                    "The function does declare a parameter named '{param_name}', but this parameter doesn't use a label. Try removing the `{param_name}:`"
+                ),
                 source_ranges,
             ))
         } else {
@@ -799,7 +798,7 @@ mod test {
     use super::*;
     use crate::{
-        execution::{memory::Stack, parse_execute, types::NumericType, ContextType},
+        execution::{ContextType, memory::Stack, parse_execute, types::NumericType},
         parsing::ast::types::{DefaultParamVal, Identifier, Parameter},
     };

View File

@@ -3,16 +3,16 @@ use std::ops::{Add, AddAssign, Mul};
 use anyhow::Result;
 use indexmap::IndexMap;
 use kittycad_modeling_cmds as kcmc;
-use kittycad_modeling_cmds::{each_cmd as mcmd, length_unit::LengthUnit, websocket::ModelingCmdReq, ModelingCmd};
+use kittycad_modeling_cmds::{ModelingCmd, each_cmd as mcmd, length_unit::LengthUnit, websocket::ModelingCmdReq};
 use parse_display::{Display, FromStr};
 use schemars::JsonSchema;
 use serde::{Deserialize, Serialize};
 use crate::{
-    engine::{PlaneName, DEFAULT_PLANE_INFO},
+    engine::{DEFAULT_PLANE_INFO, PlaneName},
     errors::{KclError, KclErrorDetails},
     execution::{
-        types::NumericType, ArtifactId, ExecState, ExecutorContext, Metadata, TagEngineInfo, TagIdentifier, UnitLen,
+        ArtifactId, ExecState, ExecutorContext, Metadata, TagEngineInfo, TagIdentifier, UnitLen, types::NumericType,
     },
     parsing::ast::types::{Node, NodeRef, TagDeclarator, TagNode},
     std::{args::TyF64, sketch::PlaneData},
@@ -472,7 +472,7 @@ impl TryFrom<PlaneData> for PlaneInfo {
             PlaneData::Plane(_) => {
                 // We will never get here since we already checked for PlaneData::Plane.
                 return Err(KclError::new_internal(KclErrorDetails::new(
-                    format!("PlaneData {:?} not found", value),
+                    format!("PlaneData {value:?} not found"),
                     Default::default(),
                 )));
             }
@@ -480,7 +480,7 @@ impl TryFrom<PlaneData> for PlaneInfo {
         let info = DEFAULT_PLANE_INFO.get(&name).ok_or_else(|| {
             KclError::new_internal(KclErrorDetails::new(
-                format!("Plane {} not found", name),
+                format!("Plane {name} not found"),
                 Default::default(),
             ))
         })?;
@@ -815,8 +815,8 @@ impl EdgeCut {
     pub fn set_id(&mut self, id: uuid::Uuid) {
         match self {
-            EdgeCut::Fillet { id: ref mut i, .. } => *i = id,
-            EdgeCut::Chamfer { id: ref mut i, .. } => *i = id,
+            EdgeCut::Fillet { id: i, .. } => *i = id,
+            EdgeCut::Chamfer { id: i, .. } => *i = id,
         }
     }
@@ -829,8 +829,8 @@ impl EdgeCut {
     pub fn set_edge_id(&mut self, id: uuid::Uuid) {
         match self {
-            EdgeCut::Fillet { edge_id: ref mut i, .. } => *i = id,
-            EdgeCut::Chamfer { edge_id: ref mut i, .. } => *i = id,
+            EdgeCut::Fillet { edge_id: i, .. } => *i = id,
+            EdgeCut::Chamfer { edge_id: i, .. } => *i = id,
         }
     }
@@ -939,6 +939,7 @@ impl From<Point3d> for Point3D {
         Self { x: p.x, y: p.y, z: p.z }
     }
 }
+
 impl From<Point3d> for kittycad_modeling_cmds::shared::Point3d<LengthUnit> {
     fn from(p: Point3d) -> Self {
         Self {
@@ -1004,12 +1005,12 @@ pub struct BasePath {
 impl BasePath {
     pub fn get_to(&self) -> [TyF64; 2] {
         let ty: NumericType = self.units.into();
-        [TyF64::new(self.to[0], ty.clone()), TyF64::new(self.to[1], ty)]
+        [TyF64::new(self.to[0], ty), TyF64::new(self.to[1], ty)]
     }
     pub fn get_from(&self) -> [TyF64; 2] {
         let ty: NumericType = self.units.into();
-        [TyF64::new(self.from[0], ty.clone()), TyF64::new(self.from[1], ty)]
+        [TyF64::new(self.from[0], ty), TyF64::new(self.from[1], ty)]
     }
 }
@@ -1030,7 +1031,7 @@ pub struct GeoMeta {
 #[ts(export)]
 #[serde(tag = "type")]
 pub enum Path {
-    /// A path that goes to a point.
+    /// A straight line which ends at the given point.
     ToPoint {
         #[serde(flatten)]
         base: BasePath,
@@ -1225,14 +1226,14 @@ impl Path {
     pub fn get_from(&self) -> [TyF64; 2] {
         let p = &self.get_base().from;
         let ty: NumericType = self.get_base().units.into();
-        [TyF64::new(p[0], ty.clone()), TyF64::new(p[1], ty)]
+        [TyF64::new(p[0], ty), TyF64::new(p[1], ty)]
     }
     /// Where does this path segment end?
     pub fn get_to(&self) -> [TyF64; 2] {
         let p = &self.get_base().to;
         let ty: NumericType = self.get_base().units.into();
-        [TyF64::new(p[0], ty.clone()), TyF64::new(p[1], ty)]
+        [TyF64::new(p[0], ty), TyF64::new(p[1], ty)]
     }
     /// The path segment start point and its type.

View File

@@ -2,12 +2,12 @@ use std::str::FromStr;
 use anyhow::Result;
 use kcmc::{
-    coord::{System, KITTYCAD},
+    ImportFile, ModelingCmd,
+    coord::{KITTYCAD, System},
     each_cmd as mcmd,
     format::InputFormat3d,
     shared::FileImportFormat,
     units::UnitLength,
-    ImportFile, ModelingCmd,
 };
 use kittycad_modeling_cmds as kcmc;
 use serde::{Deserialize, Serialize};
@@ -16,8 +16,8 @@ use uuid::Uuid;
 use crate::{
     errors::{KclError, KclErrorDetails},
     execution::{
-        annotations, typed_path::TypedPath, types::UnitLen, ExecState, ExecutorContext, ImportedGeometry,
-        ModelingCmdMeta,
+        ExecState, ExecutorContext, ImportedGeometry, ModelingCmdMeta, annotations, typed_path::TypedPath,
+        types::UnitLen,
     },
     fs::FileSystem,
     parsing::ast::types::{Annotation, Node},
@@ -184,7 +184,7 @@ pub(super) fn format_from_annotations(
                         annotations::IMPORT_LENGTH_UNIT
                     ),
                     vec![p.as_source_range()],
-                )))
+                )));
             }
         }
     }
@@ -225,7 +225,7 @@ fn set_coords(fmt: &mut InputFormat3d, coords_str: &str, source_range: SourceRan
                     annotations::IMPORT_COORDS
                 ),
                 vec![source_range],
-            )))
+            )));
         }
     }
@@ -246,7 +246,7 @@ fn set_length_unit(fmt: &mut InputFormat3d, units_str: &str, source_range: Sourc
                     annotations::IMPORT_LENGTH_UNIT
                 ),
                 vec![source_range],
-            )))
+            )));
         }
     }
@@ -291,7 +291,9 @@ fn get_import_format_from_extension(ext: &str) -> Result<InputFormat3d> {
             } else if ext == "glb" {
                 FileImportFormat::Gltf
             } else {
-                anyhow::bail!("unknown source format for file extension: {ext}. Try setting the `--src-format` flag explicitly or use a valid format.")
+                anyhow::bail!(
+                    "unknown source format for file extension: {ext}. Try setting the `--src-format` flag explicitly or use a valid format."
+                )
             }
         }
     };

View File

@@ -6,12 +6,12 @@ use std::{
 use anyhow::Result;
 use crate::{
+    ExecState, ExecutorContext, KclError, ModuleId, SourceRange,
     errors::KclErrorDetails,
     execution::typed_path::TypedPath,
     modules::{ModulePath, ModuleRepr},
     parsing::ast::types::{ImportPath, ImportStatement, Node as AstNode},
     walk::{Node, Visitable},
-    ExecState, ExecutorContext, KclError, ModuleId, SourceRange,
 };
 /// Specific dependency between two modules. The 0th element of this info
@@ -147,7 +147,7 @@ fn import_dependencies(
                 ret.lock()
                     .map_err(|err| {
                         KclError::new_internal(KclErrorDetails::new(
-                            format!("Failed to lock mutex: {}", err),
+                            format!("Failed to lock mutex: {err}"),
                             Default::default(),
                         ))
                     })?
@@ -157,7 +157,7 @@ fn import_dependencies(
                 ret.lock()
                     .map_err(|err| {
                         KclError::new_internal(KclErrorDetails::new(
-                            format!("Failed to lock mutex: {}", err),
+                            format!("Failed to lock mutex: {err}"),
                             Default::default(),
                         ))
                     })?
@@ -179,7 +179,7 @@ fn import_dependencies(
     let ret = ret.lock().map_err(|err| {
         KclError::new_internal(KclErrorDetails::new(
-            format!("Failed to lock mutex: {}", err),
+            format!("Failed to lock mutex: {err}"),
             Default::default(),
         ))
     })?;
@@ -224,7 +224,7 @@ pub(crate) async fn import_universe(
     let repr = {
         let Some(module_info) = exec_state.get_module(module_id) else {
             return Err(KclError::new_internal(KclErrorDetails::new(
-                format!("Module {} not found", module_id),
+                format!("Module {module_id} not found"),
                 vec![import_stmt.into()],
             )));
         };
@@ -244,9 +244,7 @@ mod tests {
     use crate::parsing::ast::types::{ImportSelector, Program};
     macro_rules! kcl {
-        ( $kcl:expr ) => {{
-            $crate::parsing::top_level_parse($kcl).unwrap()
-        }};
+        ( $kcl:expr_2021 ) => {{ $crate::parsing::top_level_parse($kcl).unwrap() }};
     }
     fn into_module_info(program: AstNode<Program>) -> DependencyInfo {

View File

@@ -5,18 +5,18 @@ use schemars::JsonSchema;
 use serde::Serialize;
 use crate::{
+    CompilationError, KclError, ModuleId, SourceRange,
     errors::KclErrorDetails,
     execution::{
-        annotations::{SETTINGS, SETTINGS_UNIT_LENGTH},
-        types::{NumericType, PrimitiveType, RuntimeType, UnitLen},
         EnvironmentRef, ExecState, Face, Geometry, GeometryWithImportedGeometry, Helix, ImportedGeometry, MetaSettings,
         Metadata, Plane, Sketch, Solid, TagIdentifier,
+        annotations::{SETTINGS, SETTINGS_UNIT_LENGTH},
+        types::{NumericType, PrimitiveType, RuntimeType, UnitLen},
     },
     parsing::ast::types::{
         DefaultParamVal, FunctionExpression, KclNone, Literal, LiteralValue, Node, TagDeclarator, TagNode,
     },
-    std::{args::TyF64, StdFnProps},
+    std::{StdFnProps, args::TyF64},
-    CompilationError, KclError, ModuleId, SourceRange,
 };
 pub type KclObjectFields = HashMap<String, KclValue>;
@ -136,9 +136,9 @@ impl JsonSchema for FunctionSource {
"FunctionSource".to_owned() "FunctionSource".to_owned()
} }
fn json_schema(gen: &mut schemars::gen::SchemaGenerator) -> schemars::schema::Schema { fn json_schema(r#gen: &mut schemars::r#gen::SchemaGenerator) -> schemars::schema::Schema {
// TODO: Actually generate a reasonable schema. // TODO: Actually generate a reasonable schema.
gen.subschema_for::<()>() r#gen.subschema_for::<()>()
} }
} }
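The `gen` → `r#gen` renames in the `json_schema` implementations are driven by `gen` becoming a reserved keyword in the 2024 edition. Raw identifiers are accepted in earlier editions too, which makes the change edition-compatible. A small sketch:

```rust
// `gen` is reserved in the 2024 edition, so an identifier named `gen`
// must be spelled `r#gen`; the raw form also works on older editions.
fn describe(r#gen: u32) -> String {
    format!("generation {}", r#gen)
}

fn main() {
    assert_eq!(describe(3), "generation 3");
}
```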
@ -415,15 +415,41 @@ impl KclValue {
/// Put the point into a KCL value. /// Put the point into a KCL value.
pub fn from_point2d(p: [f64; 2], ty: NumericType, meta: Vec<Metadata>) -> Self { pub fn from_point2d(p: [f64; 2], ty: NumericType, meta: Vec<Metadata>) -> Self {
let [x, y] = p;
Self::Tuple { Self::Tuple {
value: vec![ value: vec![
Self::Number { Self::Number {
value: p[0], value: x,
meta: meta.clone(), meta: meta.clone(),
ty: ty.clone(), ty,
}, },
Self::Number { Self::Number {
value: p[1], value: y,
meta: meta.clone(),
ty,
},
],
meta,
}
}
/// Put the point into a KCL value.
pub fn from_point3d(p: [f64; 3], ty: NumericType, meta: Vec<Metadata>) -> Self {
let [x, y, z] = p;
Self::Tuple {
value: vec![
Self::Number {
value: x,
meta: meta.clone(),
ty,
},
Self::Number {
value: y,
meta: meta.clone(),
ty,
},
Self::Number {
value: z,
meta: meta.clone(), meta: meta.clone(),
ty, ty,
}, },
@ -448,7 +474,7 @@ impl KclValue {
pub fn as_int_with_ty(&self) -> Option<(i64, NumericType)> { pub fn as_int_with_ty(&self) -> Option<(i64, NumericType)> {
match self { match self {
KclValue::Number { value, ty, .. } => crate::try_f64_to_i64(*value).map(|i| (i, ty.clone())), KclValue::Number { value, ty, .. } => crate::try_f64_to_i64(*value).map(|i| (i, *ty)),
_ => None, _ => None,
} }
} }
@ -562,7 +588,7 @@ impl KclValue {
pub fn as_ty_f64(&self) -> Option<TyF64> { pub fn as_ty_f64(&self) -> Option<TyF64> {
match self { match self {
KclValue::Number { value, ty, .. } => Some(TyF64::new(*value, ty.clone())), KclValue::Number { value, ty, .. } => Some(TyF64::new(*value, *ty)),
_ => None, _ => None,
} }
} }
@ -587,7 +613,7 @@ impl KclValue {
match self { match self {
KclValue::TagIdentifier(t) => Ok(*t.clone()), KclValue::TagIdentifier(t) => Ok(*t.clone()),
_ => Err(KclError::new_semantic(KclErrorDetails::new( _ => Err(KclError::new_semantic(KclErrorDetails::new(
format!("Not a tag identifier: {:?}", self), format!("Not a tag identifier: {self:?}"),
self.clone().into(), self.clone().into(),
))), ))),
} }
@ -598,7 +624,7 @@ impl KclValue {
match self { match self {
KclValue::TagDeclarator(t) => Ok((**t).clone()), KclValue::TagDeclarator(t) => Ok((**t).clone()),
_ => Err(KclError::new_semantic(KclErrorDetails::new( _ => Err(KclError::new_semantic(KclErrorDetails::new(
format!("Not a tag declarator: {:?}", self), format!("Not a tag declarator: {self:?}"),
self.clone().into(), self.clone().into(),
))), ))),
} }


@ -207,8 +207,8 @@ use std::{
fmt, fmt,
pin::Pin, pin::Pin,
sync::{ sync::{
atomic::{AtomicBool, AtomicUsize, Ordering},
Arc, Arc,
atomic::{AtomicBool, AtomicUsize, Ordering},
}, },
}; };
@ -489,7 +489,7 @@ impl ProgramMemory {
} }
Err(KclError::new_undefined_value( Err(KclError::new_undefined_value(
KclErrorDetails::new(format!("`{}` is not defined", var), vec![]), KclErrorDetails::new(format!("`{var}` is not defined"), vec![]),
Some(var.to_owned()), Some(var.to_owned()),
)) ))
} }
@ -541,22 +541,6 @@ impl Stack {
self.push_new_env_for_call(snapshot); self.push_new_env_for_call(snapshot);
} }
/// Push a new stack frame on to the call stack for callees which should not read or write
/// from memory.
///
/// This is suitable for calling standard library functions or other functions written in Rust
/// which will use 'Rust memory' rather than KCL's memory and cannot reach into the wider
/// environment.
///
/// Trying to read or write from this environment will panic with an index out of bounds.
pub fn push_new_env_for_rust_call(&mut self) {
self.call_stack.push(self.current_env);
// Rust functions shouldn't try to set or access anything in their environment, so don't
// waste time and space on a new env. Using usize::MAX means we'll get an overflow if we
// try to access anything rather than a silent error.
self.current_env = EnvironmentRef(usize::MAX, 0);
}
/// Push a new stack frame on to the call stack with no connection to a parent environment. /// Push a new stack frame on to the call stack with no connection to a parent environment.
/// ///
/// Suitable for executing a separate module. /// Suitable for executing a separate module.
@ -647,7 +631,7 @@ impl Stack {
let env = self.memory.get_env(self.current_env.index()); let env = self.memory.get_env(self.current_env.index());
if env.contains_key(&key) { if env.contains_key(&key) {
return Err(KclError::new_value_already_defined(KclErrorDetails::new( return Err(KclError::new_value_already_defined(KclErrorDetails::new(
format!("Cannot redefine `{}`", key), format!("Cannot redefine `{key}`"),
vec![source_range], vec![source_range],
))); )));
} }
@ -683,7 +667,7 @@ impl Stack {
env.contains_key(var) env.contains_key(var)
} }
/// Get a key from the first KCL (i.e., non-Rust) stack frame on the call stack. /// Get a key from the first stack frame on the call stack.
pub fn get_from_call_stack(&self, key: &str, source_range: SourceRange) -> Result<(usize, &KclValue), KclError> { pub fn get_from_call_stack(&self, key: &str, source_range: SourceRange) -> Result<(usize, &KclValue), KclError> {
if !self.current_env.skip_env() { if !self.current_env.skip_env() {
return Ok((self.current_env.1, self.get(key, source_range)?)); return Ok((self.current_env.1, self.get(key, source_range)?));
@ -695,7 +679,7 @@ impl Stack {
} }
} }
unreachable!("It can't be Rust frames all the way down"); unreachable!("No frames on the stack?");
} }
/// Iterate over all keys in the current environment which satisfy the provided predicate. /// Iterate over all keys in the current environment which satisfy the provided predicate.
@ -1047,7 +1031,7 @@ mod env {
} }
/// Take all bindings from the environment. /// Take all bindings from the environment.
pub(super) fn take_bindings(self: Pin<&mut Self>) -> impl Iterator<Item = (String, (usize, KclValue))> { pub(super) fn take_bindings(self: Pin<&mut Self>) -> impl Iterator<Item = (String, (usize, KclValue))> + use<> {
// SAFETY: caller must have unique access since self is mut. We're not moving or invalidating `self`. // SAFETY: caller must have unique access since self is mut. We're not moving or invalidating `self`.
let bindings = std::mem::take(unsafe { self.bindings.get().as_mut().unwrap() }); let bindings = std::mem::take(unsafe { self.bindings.get().as_mut().unwrap() });
bindings.into_iter() bindings.into_iter()
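The `+ use<>` added to `take_bindings` is the 2024-edition precise-capturing syntax: by default `-> impl Iterator` now captures every in-scope lifetime, so a borrow of `self` would keep the environment borrowed; `+ use<>` declares that the returned iterator captures nothing. A minimal sketch under those assumptions (needs rustc ≥ 1.82; `Env` here is a simplified stand-in, not the real type):

```rust
struct Env {
    bindings: Vec<(String, i32)>,
}

impl Env {
    // `use<>` opts out of lifetime capture: the iterator owns the
    // taken bindings and borrows nothing from `self`.
    fn take_bindings(&mut self) -> impl Iterator<Item = (String, i32)> + use<> {
        std::mem::take(&mut self.bindings).into_iter()
    }
}

fn main() {
    let mut env = Env { bindings: vec![("a".to_string(), 1)] };
    let taken: Vec<_> = env.take_bindings().collect();
    // `env` is immediately usable again; nothing stayed borrowed.
    assert!(env.bindings.is_empty());
    assert_eq!(taken, vec![("a".to_string(), 1)]);
}
```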
@ -1217,24 +1201,6 @@ mod test {
assert_get_from(mem, "c", 5, callee); assert_get_from(mem, "c", 5, callee);
} }
#[test]
fn rust_env() {
let mem = &mut Stack::new_for_tests();
mem.add("a".to_owned(), val(1), sr()).unwrap();
mem.add("b".to_owned(), val(3), sr()).unwrap();
let sn = mem.snapshot();
mem.push_new_env_for_rust_call();
mem.push_new_env_for_call(sn);
assert_get(mem, "b", 3);
mem.add("b".to_owned(), val(4), sr()).unwrap();
assert_get(mem, "b", 4);
mem.pop_env();
mem.pop_env();
assert_get(mem, "b", 3);
}
#[test] #[test]
fn deep_call_env() { fn deep_call_env() {
let mem = &mut Stack::new_for_tests(); let mem = &mut Stack::new_for_tests();


@ -16,10 +16,9 @@ pub(crate) use import::PreImportedGeometry;
use indexmap::IndexMap; use indexmap::IndexMap;
pub use kcl_value::{KclObjectFields, KclValue}; pub use kcl_value::{KclObjectFields, KclValue};
use kcmc::{ use kcmc::{
each_cmd as mcmd, ImageFormat, ModelingCmd, each_cmd as mcmd,
ok_response::{output::TakeSnapshot, OkModelingCmdResponse}, ok_response::{OkModelingCmdResponse, output::TakeSnapshot},
websocket::{ModelingSessionData, OkWebSocketResponseData}, websocket::{ModelingSessionData, OkWebSocketResponseData},
ImageFormat, ModelingCmd,
}; };
use kittycad_modeling_cmds::{self as kcmc, id::ModelingCmdId}; use kittycad_modeling_cmds::{self as kcmc, id::ModelingCmdId};
pub use memory::EnvironmentRef; pub use memory::EnvironmentRef;
@ -31,6 +30,7 @@ pub use state::{ExecState, MetaSettings};
use uuid::Uuid; use uuid::Uuid;
use crate::{ use crate::{
CompilationError, ExecError, KclErrorWithOutputs,
engine::{EngineManager, GridScaleBehavior}, engine::{EngineManager, GridScaleBehavior},
errors::{KclError, KclErrorDetails}, errors::{KclError, KclErrorDetails},
execution::{ execution::{
@ -43,7 +43,6 @@ use crate::{
modules::{ModuleId, ModulePath, ModuleRepr}, modules::{ModuleId, ModulePath, ModuleRepr},
parsing::ast::types::{Expr, ImportPath, NodeRef}, parsing::ast::types::{Expr, ImportPath, NodeRef},
source_range::SourceRange, source_range::SourceRange,
CompilationError, ExecError, KclErrorWithOutputs,
}; };
pub(crate) mod annotations; pub(crate) mod annotations;
@ -1329,7 +1328,7 @@ impl ExecutorContext {
created: if deterministic_time { created: if deterministic_time {
Some("2021-01-01T00:00:00Z".parse().map_err(|e| { Some("2021-01-01T00:00:00Z".parse().map_err(|e| {
KclError::new_internal(crate::errors::KclErrorDetails::new( KclError::new_internal(crate::errors::KclErrorDetails::new(
format!("Failed to parse date: {}", e), format!("Failed to parse date: {e}"),
vec![SourceRange::default()], vec![SourceRange::default()],
)) ))
})?) })?)
@ -1409,7 +1408,7 @@ pub(crate) async fn parse_execute_with_project_dir(
engine: Arc::new(Box::new( engine: Arc::new(Box::new(
crate::engine::conn_mock::EngineConnection::new().await.map_err(|err| { crate::engine::conn_mock::EngineConnection::new().await.map_err(|err| {
KclError::new_internal(crate::errors::KclErrorDetails::new( KclError::new_internal(crate::errors::KclErrorDetails::new(
format!("Failed to create mock engine connection: {}", err), format!("Failed to create mock engine connection: {err}"),
vec![SourceRange::default()], vec![SourceRange::default()],
)) ))
})?, })?,
@ -1446,7 +1445,7 @@ mod tests {
use pretty_assertions::assert_eq; use pretty_assertions::assert_eq;
use super::*; use super::*;
use crate::{errors::KclErrorDetails, execution::memory::Stack, ModuleId}; use crate::{ModuleId, errors::KclErrorDetails, execution::memory::Stack};
/// Convenience function to get a JSON value from memory and unwrap. /// Convenience function to get a JSON value from memory and unwrap.
#[track_caller] #[track_caller]
@ -1921,6 +1920,22 @@ shape = layer() |> patternTransform(instances = 10, transform = transform)
); );
} }
#[tokio::test(flavor = "multi_thread")]
async fn pass_std_to_std() {
let ast = r#"sketch001 = startSketchOn(XY)
profile001 = circle(sketch001, center = [0, 0], radius = 2)
extrude001 = extrude(profile001, length = 5)
extrudes = patternLinear3d(
extrude001,
instances = 3,
distance = 5,
axis = [1, 1, 0],
)
clone001 = map(extrudes, f = clone)
"#;
parse_execute(ast).await.unwrap();
}
#[tokio::test(flavor = "multi_thread")] #[tokio::test(flavor = "multi_thread")]
async fn test_zero_param_fn() { async fn test_zero_param_fn() {
let ast = r#"sigmaAllow = 35000 // psi let ast = r#"sigmaAllow = 35000 // psi
@ -2045,8 +2060,7 @@ notFunction = !x";
fn_err fn_err
.message() .message()
.starts_with("Cannot apply unary operator ! to non-boolean value: "), .starts_with("Cannot apply unary operator ! to non-boolean value: "),
"Actual error: {:?}", "Actual error: {fn_err:?}"
fn_err
); );
let code8 = " let code8 = "
@ -2059,8 +2073,7 @@ notTagDeclarator = !myTagDeclarator";
tag_declarator_err tag_declarator_err
.message() .message()
.starts_with("Cannot apply unary operator ! to non-boolean value: a tag declarator"), .starts_with("Cannot apply unary operator ! to non-boolean value: a tag declarator"),
"Actual error: {:?}", "Actual error: {tag_declarator_err:?}"
tag_declarator_err
); );
let code9 = " let code9 = "
@ -2073,8 +2086,7 @@ notTagIdentifier = !myTag";
tag_identifier_err tag_identifier_err
.message() .message()
.starts_with("Cannot apply unary operator ! to non-boolean value: a tag identifier"), .starts_with("Cannot apply unary operator ! to non-boolean value: a tag identifier"),
"Actual error: {:?}", "Actual error: {tag_identifier_err:?}"
tag_identifier_err
); );
let code10 = "notPipe = !(1 |> 2)"; let code10 = "notPipe = !(1 |> 2)";
@ -2226,7 +2238,7 @@ w = f() + f()
if let Err(err) = ctx.run_with_caching(old_program).await { if let Err(err) = ctx.run_with_caching(old_program).await {
let report = err.into_miette_report_with_outputs(code).unwrap(); let report = err.into_miette_report_with_outputs(code).unwrap();
let report = miette::Report::new(report); let report = miette::Report::new(report);
panic!("Error executing program: {:?}", report); panic!("Error executing program: {report:?}");
} }
// Get the id_generator from the first execution. // Get the id_generator from the first execution.


@ -8,10 +8,10 @@ use uuid::Uuid;
#[cfg(feature = "artifact-graph")] #[cfg(feature = "artifact-graph")]
use crate::exec::ArtifactCommand; use crate::exec::ArtifactCommand;
use crate::{ use crate::{
ExecState, ExecutorContext, KclError, SourceRange,
exec::{IdGenerator, KclValue}, exec::{IdGenerator, KclValue},
execution::Solid, execution::Solid,
std::Args, std::Args,
ExecState, ExecutorContext, KclError, SourceRange,
}; };
/// Context and metadata needed to send a single modeling command. /// Context and metadata needed to send a single modeling command.


@ -9,20 +9,19 @@ use uuid::Uuid;
#[cfg(feature = "artifact-graph")] #[cfg(feature = "artifact-graph")]
use crate::execution::{Artifact, ArtifactCommand, ArtifactGraph, ArtifactId}; use crate::execution::{Artifact, ArtifactCommand, ArtifactGraph, ArtifactId};
use crate::{ use crate::{
CompilationError, EngineManager, ExecutorContext, KclErrorWithOutputs,
errors::{KclError, KclErrorDetails, Severity}, errors::{KclError, KclErrorDetails, Severity},
exec::DefaultPlanes, exec::DefaultPlanes,
execution::{ execution::{
annotations, EnvironmentRef, ExecOutcome, ExecutorSettings, KclValue, UnitAngle, UnitLen, annotations,
cad_op::Operation, cad_op::Operation,
id_generator::IdGenerator, id_generator::IdGenerator,
memory::{ProgramMemory, Stack}, memory::{ProgramMemory, Stack},
types::{self, NumericType}, types::{self, NumericType},
EnvironmentRef, ExecOutcome, ExecutorSettings, KclValue, UnitAngle, UnitLen,
}, },
modules::{ModuleId, ModuleInfo, ModuleLoader, ModulePath, ModuleRepr, ModuleSource}, modules::{ModuleId, ModuleInfo, ModuleLoader, ModulePath, ModuleRepr, ModuleSource},
parsing::ast::types::{Annotation, NodeRef}, parsing::ast::types::{Annotation, NodeRef},
source_range::SourceRange, source_range::SourceRange,
CompilationError, EngineManager, ExecutorContext, KclErrorWithOutputs,
}; };
/// State for executing a program. /// State for executing a program.
@ -555,7 +554,7 @@ impl MetaSettings {
annotations::SETTINGS_UNIT_ANGLE annotations::SETTINGS_UNIT_ANGLE
), ),
vec![annotation.as_source_range()], vec![annotation.as_source_range()],
))) )));
} }
} }
} }


@ -220,9 +220,9 @@ impl schemars::JsonSchema for TypedPath {
"TypedPath".to_owned() "TypedPath".to_owned()
} }
fn json_schema(gen: &mut schemars::gen::SchemaGenerator) -> schemars::schema::Schema { fn json_schema(r#gen: &mut schemars::r#gen::SchemaGenerator) -> schemars::schema::Schema {
// TODO: Actually generate a reasonable schema. // TODO: Actually generate a reasonable schema.
gen.subschema_for::<std::path::PathBuf>() r#gen.subschema_for::<std::path::PathBuf>()
} }
} }


@ -5,17 +5,17 @@ use schemars::JsonSchema;
use serde::{Deserialize, Serialize}; use serde::{Deserialize, Serialize};
use crate::{ use crate::{
CompilationError, SourceRange,
execution::{ execution::{
ExecState, Plane, PlaneInfo, Point3d,
kcl_value::{KclValue, TypeDef}, kcl_value::{KclValue, TypeDef},
memory::{self}, memory::{self},
ExecState, Plane, PlaneInfo, Point3d,
}, },
parsing::{ parsing::{
ast::types::{PrimitiveType as AstPrimitiveType, Type}, ast::types::{PrimitiveType as AstPrimitiveType, Type},
token::NumericSuffix, token::NumericSuffix,
}, },
std::args::{FromKclValue, TyF64}, std::args::{FromKclValue, TyF64},
CompilationError, SourceRange,
}; };
#[derive(Debug, Clone, PartialEq)] #[derive(Debug, Clone, PartialEq)]
@ -210,7 +210,7 @@ impl RuntimeType {
let ty_val = exec_state let ty_val = exec_state
.stack() .stack()
.get(&format!("{}{}", memory::TYPE_PREFIX, alias), source_range) .get(&format!("{}{}", memory::TYPE_PREFIX, alias), source_range)
.map_err(|_| CompilationError::err(source_range, format!("Unknown type: {}", alias)))?; .map_err(|_| CompilationError::err(source_range, format!("Unknown type: {alias}")))?;
Ok(match ty_val { Ok(match ty_val {
KclValue::Type { value, .. } => match value { KclValue::Type { value, .. } => match value {
@ -241,7 +241,7 @@ impl RuntimeType {
"a tuple with values of types ({})", "a tuple with values of types ({})",
tys.iter().map(Self::human_friendly_type).collect::<Vec<_>>().join(", ") tys.iter().map(Self::human_friendly_type).collect::<Vec<_>>().join(", ")
), ),
RuntimeType::Object(_) => format!("an object with fields {}", self), RuntimeType::Object(_) => format!("an object with fields {self}"),
} }
} }
@ -460,7 +460,7 @@ impl fmt::Display for PrimitiveType {
} }
} }
#[derive(Debug, Clone, Serialize, Deserialize, PartialEq, ts_rs::TS, JsonSchema)] #[derive(Debug, Clone, Copy, Serialize, Deserialize, PartialEq, ts_rs::TS, JsonSchema)]
#[ts(export)] #[ts(export)]
#[serde(tag = "type")] #[serde(tag = "type")]
pub enum NumericType { pub enum NumericType {
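Adding `Copy` to this derive list is what enables the many `ty.clone()` → `*ty` replacements elsewhere in the diff: a `Copy` type can be duplicated by dereferencing instead of an explicit clone. A sketch with a simplified stand-in enum (the real `NumericType` carries small unit variants, which is why `Copy` is cheap):

```rust
#[derive(Debug, Clone, Copy, PartialEq)]
enum NumericType {
    Count,
    Mm,
}

fn ty_of(ty: &NumericType) -> NumericType {
    // With Copy this is a bitwise copy; without it, `ty.clone()`.
    *ty
}

fn main() {
    let ty = NumericType::Mm;
    assert_eq!(ty_of(&ty), NumericType::Mm);
    // Copy types are not moved out; the original remains usable.
    assert_eq!(ty, NumericType::Mm);
}
```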
@ -575,7 +575,7 @@ impl NumericType {
match (&ty, &i.ty) { match (&ty, &i.ty) {
(Any, Default { .. }) if i.n == 0.0 => {} (Any, Default { .. }) if i.n == 0.0 => {}
(Any, t) => { (Any, t) => {
ty = t.clone(); ty = *t;
} }
(_, Unknown) | (Default { .. }, Default { .. }) => return (result, Unknown), (_, Unknown) | (Default { .. }, Default { .. }) => return (result, Unknown),
@ -598,7 +598,7 @@ impl NumericType {
} }
if ty == Any && !input.is_empty() { if ty == Any && !input.is_empty() {
ty = input[0].ty.clone(); ty = input[0].ty;
} }
(result, ty) (result, ty)
@ -722,7 +722,7 @@ impl NumericType {
if ty.subtype(self) { if ty.subtype(self) {
return Ok(KclValue::Number { return Ok(KclValue::Number {
value: *value, value: *value,
ty: ty.clone(), ty: *ty,
meta: meta.clone(), meta: meta.clone(),
}); });
} }
@ -736,7 +736,7 @@ impl NumericType {
(Any, _) => Ok(KclValue::Number { (Any, _) => Ok(KclValue::Number {
value: *value, value: *value,
ty: self.clone(), ty: *self,
meta: meta.clone(), meta: meta.clone(),
}), }),
@ -744,7 +744,7 @@ impl NumericType {
// means accept any number rather than force the current default. // means accept any number rather than force the current default.
(_, Default { .. }) => Ok(KclValue::Number { (_, Default { .. }) => Ok(KclValue::Number {
value: *value, value: *value,
ty: ty.clone(), ty: *ty,
meta: meta.clone(), meta: meta.clone(),
}), }),
@ -840,6 +840,18 @@ pub enum UnitType {
Angle(UnitAngle), Angle(UnitAngle),
} }
impl UnitType {
pub(crate) fn to_suffix(self) -> Option<String> {
match self {
UnitType::Count => Some("_".to_owned()),
UnitType::Length(UnitLen::Unknown) => None,
UnitType::Angle(UnitAngle::Unknown) => None,
UnitType::Length(l) => Some(l.to_string()),
UnitType::Angle(a) => Some(a.to_string()),
}
}
}
impl std::fmt::Display for UnitType { impl std::fmt::Display for UnitType {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result { fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self { match self {
@ -1479,7 +1491,7 @@ impl KclValue {
pub fn principal_type(&self) -> Option<RuntimeType> { pub fn principal_type(&self) -> Option<RuntimeType> {
match self { match self {
KclValue::Bool { .. } => Some(RuntimeType::Primitive(PrimitiveType::Boolean)), KclValue::Bool { .. } => Some(RuntimeType::Primitive(PrimitiveType::Boolean)),
KclValue::Number { ty, .. } => Some(RuntimeType::Primitive(PrimitiveType::Number(ty.clone()))), KclValue::Number { ty, .. } => Some(RuntimeType::Primitive(PrimitiveType::Number(*ty))),
KclValue::String { .. } => Some(RuntimeType::Primitive(PrimitiveType::String)), KclValue::String { .. } => Some(RuntimeType::Primitive(PrimitiveType::String)),
KclValue::Object { value, .. } => { KclValue::Object { value, .. } => {
let properties = value let properties = value
@ -1529,7 +1541,7 @@ impl KclValue {
#[cfg(test)] #[cfg(test)]
mod test { mod test {
use super::*; use super::*;
use crate::execution::{parse_execute, ExecTestResults}; use crate::execution::{ExecTestResults, parse_execute};
fn values(exec_state: &mut ExecState) -> Vec<KclValue> { fn values(exec_state: &mut ExecState) -> Vec<KclValue> {
vec![ vec![
@ -1975,14 +1987,16 @@ mod test {
]) ])
) )
); );
assert!(RuntimeType::Union(vec![ assert!(
RuntimeType::Primitive(PrimitiveType::Number(NumericType::Any)), RuntimeType::Union(vec![
RuntimeType::Primitive(PrimitiveType::Boolean) RuntimeType::Primitive(PrimitiveType::Number(NumericType::Any)),
]) RuntimeType::Primitive(PrimitiveType::Boolean)
.subtype(&RuntimeType::Union(vec![ ])
RuntimeType::Primitive(PrimitiveType::Number(NumericType::Any)), .subtype(&RuntimeType::Union(vec![
RuntimeType::Primitive(PrimitiveType::Boolean) RuntimeType::Primitive(PrimitiveType::Number(NumericType::Any)),
]))); RuntimeType::Primitive(PrimitiveType::Boolean)
]))
);
// Covariance // Covariance
let count = KclValue::Number { let count = KclValue::Number {

View File

@ -45,6 +45,31 @@ pub fn format_number_literal(value: f64, suffix: NumericSuffix) -> Result<String
} }
} }
#[derive(Debug, Clone, PartialEq, Serialize, thiserror::Error)]
#[serde(tag = "type")]
pub enum FormatNumericTypeError {
#[error("Invalid numeric type: {0:?}")]
Invalid(NumericType),
}
/// For UI code generation, format a number value with a suffix such that the
/// result can parse as a literal. If it can't be done, returns an error.
///
/// This is used by TS.
pub fn format_number_value(value: f64, ty: NumericType) -> Result<String, FormatNumericTypeError> {
match ty {
NumericType::Default { .. } => Ok(value.to_string()),
// There isn't a syntactic suffix for these. For unknown, we don't want
// to ever generate the unknown suffix. We currently warn on it, and we
// may remove it in the future.
NumericType::Unknown | NumericType::Any => Err(FormatNumericTypeError::Invalid(ty)),
NumericType::Known(unit_type) => unit_type
.to_suffix()
.map(|suffix| format!("{value}{suffix}"))
.ok_or(FormatNumericTypeError::Invalid(ty)),
}
}
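The suffix mapping that `UnitType::to_suffix` and `format_number_value` implement can be sketched standalone. This simplified stand-in (names and variants are illustrative, not the real API) shows the two key decisions visible in the diff: counts get the `_` suffix, and unknown units have no literal syntax, so formatting them is an error:

```rust
#[derive(Debug, PartialEq)]
enum Unit {
    Count,
    Mm,
    Deg,
    Unknown,
}

// Simplified sketch of the to_suffix + format_number_value pair.
fn format_number(value: f64, unit: Unit) -> Result<String, String> {
    let suffix = match unit {
        Unit::Count => "_",
        Unit::Mm => "mm",
        Unit::Deg => "deg",
        Unit::Unknown => return Err("no suffix for unknown units".to_owned()),
    };
    Ok(format!("{value}{suffix}"))
}

fn main() {
    assert_eq!(format_number(1.0, Unit::Mm), Ok("1mm".to_owned()));
    assert_eq!(format_number(1.0, Unit::Count), Ok("1_".to_owned()));
    assert!(format_number(1.0, Unit::Unknown).is_err());
}
```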
#[cfg(test)] #[cfg(test)]
mod tests { mod tests {
use pretty_assertions::assert_eq; use pretty_assertions::assert_eq;
@ -134,4 +159,74 @@ mod tests {
Err(FormatNumericSuffixError::Invalid(NumericSuffix::Unknown)) Err(FormatNumericSuffixError::Invalid(NumericSuffix::Unknown))
); );
} }
#[test]
fn test_format_number_value() {
assert_eq!(
format_number_value(
1.0,
NumericType::Default {
len: Default::default(),
angle: Default::default()
}
),
Ok("1".to_owned())
);
assert_eq!(
format_number_value(1.0, NumericType::Known(UnitType::Length(UnitLen::Unknown))),
Err(FormatNumericTypeError::Invalid(NumericType::Known(UnitType::Length(
UnitLen::Unknown
))))
);
assert_eq!(
format_number_value(1.0, NumericType::Known(UnitType::Angle(UnitAngle::Unknown))),
Err(FormatNumericTypeError::Invalid(NumericType::Known(UnitType::Angle(
UnitAngle::Unknown
))))
);
assert_eq!(
format_number_value(1.0, NumericType::Known(UnitType::Count)),
Ok("1_".to_owned())
);
assert_eq!(
format_number_value(1.0, NumericType::Known(UnitType::Length(UnitLen::Mm))),
Ok("1mm".to_owned())
);
assert_eq!(
format_number_value(1.0, NumericType::Known(UnitType::Length(UnitLen::Cm))),
Ok("1cm".to_owned())
);
assert_eq!(
format_number_value(1.0, NumericType::Known(UnitType::Length(UnitLen::M))),
Ok("1m".to_owned())
);
assert_eq!(
format_number_value(1.0, NumericType::Known(UnitType::Length(UnitLen::Inches))),
Ok("1in".to_owned())
);
assert_eq!(
format_number_value(1.0, NumericType::Known(UnitType::Length(UnitLen::Feet))),
Ok("1ft".to_owned())
);
assert_eq!(
format_number_value(1.0, NumericType::Known(UnitType::Length(UnitLen::Yards))),
Ok("1yd".to_owned())
);
assert_eq!(
format_number_value(1.0, NumericType::Known(UnitType::Angle(UnitAngle::Degrees))),
Ok("1deg".to_owned())
);
assert_eq!(
format_number_value(1.0, NumericType::Known(UnitType::Angle(UnitAngle::Radians))),
Ok("1rad".to_owned())
);
assert_eq!(
format_number_value(1.0, NumericType::Unknown),
Err(FormatNumericTypeError::Invalid(NumericType::Unknown))
);
assert_eq!(
format_number_value(1.0, NumericType::Any),
Err(FormatNumericTypeError::Invalid(NumericType::Any))
);
}
} }


@ -3,10 +3,10 @@
use anyhow::Result; use anyhow::Result;
use crate::{ use crate::{
SourceRange,
errors::{KclError, KclErrorDetails}, errors::{KclError, KclErrorDetails},
execution::typed_path::TypedPath, execution::typed_path::TypedPath,
fs::FileSystem, fs::FileSystem,
SourceRange,
}; };
#[derive(Debug, Clone)] #[derive(Debug, Clone)]


@ -2,7 +2,7 @@
use anyhow::Result; use anyhow::Result;
use crate::{execution::typed_path::TypedPath, SourceRange}; use crate::{SourceRange, execution::typed_path::TypedPath};
#[cfg(not(target_arch = "wasm32"))] #[cfg(not(target_arch = "wasm32"))]
pub mod local; pub mod local;


@ -4,11 +4,11 @@ use anyhow::Result;
use wasm_bindgen::prelude::wasm_bindgen; use wasm_bindgen::prelude::wasm_bindgen;
use crate::{ use crate::{
SourceRange,
errors::{KclError, KclErrorDetails}, errors::{KclError, KclErrorDetails},
execution::typed_path::TypedPath, execution::typed_path::TypedPath,
fs::FileSystem, fs::FileSystem,
wasm::JsFuture, wasm::JsFuture,
SourceRange,
}; };
#[wasm_bindgen(module = "/../../src/lang/std/fileSystemManager.ts")] #[wasm_bindgen(module = "/../../src/lang/std/fileSystemManager.ts")]


@ -90,10 +90,9 @@ pub use errors::{
ReportWithOutputs, ReportWithOutputs,
}; };
pub use execution::{ pub use execution::{
bust_cache, clear_mem_cache, ExecOutcome, ExecState, ExecutorContext, ExecutorSettings, MetaSettings, Point2d, bust_cache, clear_mem_cache,
typed_path::TypedPath, typed_path::TypedPath,
types::{UnitAngle, UnitLen}, types::{UnitAngle, UnitLen},
ExecOutcome, ExecState, ExecutorContext, ExecutorSettings, MetaSettings, Point2d,
}; };
pub use lsp::{ pub use lsp::{
copilot::Backend as CopilotLspBackend, copilot::Backend as CopilotLspBackend,
@ -101,7 +100,7 @@ pub use lsp::{
}; };
pub use modules::ModuleId; pub use modules::ModuleId;
pub use parsing::ast::types::{FormatOptions, NodePath, Step as NodePathStep}; pub use parsing::ast::types::{FormatOptions, NodePath, Step as NodePathStep};
pub use settings::types::{project::ProjectConfiguration, Configuration, UnitLength}; pub use settings::types::{Configuration, UnitLength, project::ProjectConfiguration};
pub use source_range::SourceRange; pub use source_range::SourceRange;
#[cfg(not(target_arch = "wasm32"))] #[cfg(not(target_arch = "wasm32"))]
pub use unparser::{recast_dir, walk_dir}; pub use unparser::{recast_dir, walk_dir};
@ -109,12 +108,12 @@ pub use unparser::{recast_dir, walk_dir};
// Rather than make executor public and make lots of it pub(crate), just re-export into a new module. // Rather than make executor public and make lots of it pub(crate), just re-export into a new module.
// Ideally we wouldn't export these things at all, they should only be used for testing. // Ideally we wouldn't export these things at all, they should only be used for testing.
pub mod exec { pub mod exec {
pub use crate::execution::{
types::{NumericType, UnitAngle, UnitLen, UnitType},
DefaultPlanes, IdGenerator, KclValue, PlaneType, Sketch,
};
#[cfg(feature = "artifact-graph")] #[cfg(feature = "artifact-graph")]
pub use crate::execution::{ArtifactCommand, Operation}; pub use crate::execution::{ArtifactCommand, Operation};
pub use crate::execution::{
DefaultPlanes, IdGenerator, KclValue, PlaneType, Sketch,
types::{NumericType, UnitAngle, UnitLen, UnitType},
};
} }
#[cfg(target_arch = "wasm32")] #[cfg(target_arch = "wasm32")]
@ -136,12 +135,12 @@ pub mod native_engine {
} }
pub mod std_utils { pub mod std_utils {
pub use crate::std::utils::{get_tangential_arc_to_info, is_points_ccw_wasm, TangentialArcInfoInput}; pub use crate::std::utils::{TangentialArcInfoInput, get_tangential_arc_to_info, is_points_ccw_wasm};
} }
pub mod pretty { pub mod pretty {
pub use crate::{ pub use crate::{
fmt::{format_number_literal, human_display_number}, fmt::{format_number_literal, format_number_value, human_display_number},
parsing::token::NumericSuffix, parsing::token::NumericSuffix,
}; };
} }
@ -160,7 +159,7 @@ lazy_static::lazy_static! {
#[cfg(feature = "cli")] #[cfg(feature = "cli")]
let named_extensions = kittycad::types::FileImportFormat::value_variants() let named_extensions = kittycad::types::FileImportFormat::value_variants()
.iter() .iter()
.map(|x| format!("{}", x)) .map(|x| format!("{x}"))
.collect::<Vec<String>>(); .collect::<Vec<String>>();
#[cfg(not(feature = "cli"))] #[cfg(not(feature = "cli"))]
let named_extensions = vec![]; // We don't really need this outside of the CLI. let named_extensions = vec![]; // We don't really need this outside of the CLI.
@ -276,41 +275,25 @@ impl Program {
#[inline] #[inline]
fn try_f64_to_usize(f: f64) -> Option<usize> { fn try_f64_to_usize(f: f64) -> Option<usize> {
let i = f as usize; let i = f as usize;
if i as f64 == f { if i as f64 == f { Some(i) } else { None }
Some(i)
} else {
None
}
} }
#[inline] #[inline]
fn try_f64_to_u32(f: f64) -> Option<u32> { fn try_f64_to_u32(f: f64) -> Option<u32> {
let i = f as u32; let i = f as u32;
if i as f64 == f { if i as f64 == f { Some(i) } else { None }
Some(i)
} else {
None
}
} }
#[inline] #[inline]
fn try_f64_to_u64(f: f64) -> Option<u64> { fn try_f64_to_u64(f: f64) -> Option<u64> {
let i = f as u64; let i = f as u64;
if i as f64 == f { if i as f64 == f { Some(i) } else { None }
Some(i)
} else {
None
}
} }
#[inline] #[inline]
fn try_f64_to_i64(f: f64) -> Option<i64> { fn try_f64_to_i64(f: f64) -> Option<i64> {
let i = f as i64; let i = f as i64;
if i as f64 == f { if i as f64 == f { Some(i) } else { None }
Some(i)
} else {
None
}
} }
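The `try_f64_to_*` helpers reformatted above all use the same round-trip trick: cast, then cast back, and accept only if the value survives unchanged. This rejects fractional values, NaN, and out-of-range inputs in one comparison. A self-contained copy of the pattern:

```rust
fn try_f64_to_i64(f: f64) -> Option<i64> {
    let i = f as i64; // saturating cast in Rust
    if i as f64 == f { Some(i) } else { None }
}

fn main() {
    assert_eq!(try_f64_to_i64(5.0), Some(5));
    assert_eq!(try_f64_to_i64(5.5), None); // fractional: round-trip fails
    assert_eq!(try_f64_to_i64(f64::NAN), None); // NaN never compares equal
    // Edge case of the pattern: 2^63 saturates to i64::MAX, which rounds
    // back to 2^63 as an f64, so the round-trip check accepts it.
    assert_eq!(try_f64_to_i64(i64::MAX as f64), Some(i64::MAX));
}
```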
/// Get the version of the KCL library. /// Get the version of the KCL library.


@ -2,19 +2,19 @@ use anyhow::Result;
use convert_case::Casing; use convert_case::Casing;
use crate::{ use crate::{
SourceRange,
errors::Suggestion, errors::Suggestion,
lint::rule::{def_finding, Discovered, Finding}, lint::rule::{Discovered, Finding, def_finding},
parsing::ast::types::{Node as AstNode, ObjectProperty, Program, VariableDeclarator}, parsing::ast::types::{Node as AstNode, ObjectProperty, Program, VariableDeclarator},
walk::Node, walk::Node,
SourceRange,
}; };
def_finding!( def_finding!(
Z0001, Z0001,
"Identifiers must be lowerCamelCase", "Identifiers should be lowerCamelCase",
"\ "\
By convention, variable names are lowerCamelCase, not snake_case, kebab-case, By convention, variable names are lowerCamelCase, not snake_case, kebab-case,
nor CammelCase. 🐪 nor upper CamelCase (aka PascalCase). 🐪
For instance, a good identifier for the variable representing 'box height' For instance, a good identifier for the variable representing 'box height'
would be 'boxHeight', not 'BOX_HEIGHT', 'box_height' nor 'BoxHeight'. For would be 'boxHeight', not 'BOX_HEIGHT', 'box_height' nor 'BoxHeight'. For
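The rename the Z0001 lint suggests (e.g. `box_height` → `boxHeight`) can be sketched with a tiny std-only converter. This is a hypothetical helper for illustration; the real lint uses the `convert_case` crate's `is_case`/casing support rather than this function:

```rust
// Convert snake_case or kebab-case identifiers to lowerCamelCase.
fn to_lower_camel(name: &str) -> String {
    let mut out = String::new();
    let mut upper_next = false;
    for ch in name.chars() {
        if ch == '_' || ch == '-' {
            upper_next = true; // drop separator, capitalize what follows
        } else if upper_next {
            out.extend(ch.to_uppercase());
            upper_next = false;
        } else {
            out.push(ch);
        }
    }
    out
}

fn main() {
    assert_eq!(to_lower_camel("box_height"), "boxHeight");
    assert_eq!(to_lower_camel("box-height"), "boxHeight");
    assert_eq!(to_lower_camel("boxHeight"), "boxHeight");
}
```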
@ -38,12 +38,12 @@ fn lint_lower_camel_case_var(decl: &VariableDeclarator, prog: &AstNode<Program>)
let recast = prog.recast(&Default::default(), 0); let recast = prog.recast(&Default::default(), 0);
let suggestion = Suggestion { let suggestion = Suggestion {
title: format!("rename '{}' to '{}'", name, new_name), title: format!("rename '{name}' to '{new_name}'"),
insert: recast, insert: recast,
source_range: prog.as_source_range(), source_range: prog.as_source_range(),
}; };
findings.push(Z0001.at( findings.push(Z0001.at(
format!("found '{}'", name), format!("found '{name}'"),
SourceRange::new(ident.start, ident.end, ident.module_id), SourceRange::new(ident.start, ident.end, ident.module_id),
Some(suggestion.clone()), Some(suggestion.clone()),
)); ));
@ -61,7 +61,7 @@ fn lint_lower_camel_case_property(decl: &ObjectProperty, _prog: &AstNode<Program
if !name.is_case(convert_case::Case::Camel) { if !name.is_case(convert_case::Case::Camel) {
// We can't rename the properties yet. // We can't rename the properties yet.
findings.push(Z0001.at( findings.push(Z0001.at(
format!("found '{}'", name), format!("found '{name}'"),
SourceRange::new(ident.start, ident.end, ident.module_id), SourceRange::new(ident.start, ident.end, ident.module_id),
None, None,
)); ));
@ -93,7 +93,7 @@ pub fn lint_object_properties(decl: Node, prog: &AstNode<Program>) -> Result<Vec
#[cfg(test)] #[cfg(test)]
mod tests { mod tests {
use super::{lint_object_properties, lint_variables, Z0001}; use super::{Z0001, lint_object_properties, lint_variables};
use crate::lint::rule::{assert_finding, test_finding, test_no_finding}; use crate::lint::rule::{assert_finding, test_finding, test_no_finding};
#[tokio::test] #[tokio::test]

View File

@@ -4,7 +4,7 @@ use crate::{
 errors::Suggestion,
 lint::{
 checks::offset_plane::start_sketch_on_check_specific_plane,
-rule::{def_finding, Discovered, Finding},
+rule::{Discovered, Finding, def_finding},
 },
 parsing::ast::types::{Node as AstNode, Program},
 walk::Node,
@@ -33,14 +33,11 @@ pub fn lint_should_be_default_plane(node: Node, _prog: &AstNode<Program>) -> Res
 }
 let suggestion = Suggestion {
 title: "use defaultPlane instead".to_owned(),
-insert: format!("{}", plane_name),
+insert: format!("{plane_name}"),
 source_range: call_source_range,
 };
 Ok(vec![Z0002.at(
-format!(
-"custom plane in startSketchOn; defaultPlane {} would work here",
-plane_name
-),
+format!("custom plane in startSketchOn; defaultPlane {plane_name} would work here"),
 call_source_range,
 Some(suggestion),
 )])
@@ -48,7 +45,7 @@ pub fn lint_should_be_default_plane(node: Node, _prog: &AstNode<Program>) -> Res
 #[cfg(test)]
 mod tests {
-use super::{lint_should_be_default_plane, Z0002};
+use super::{Z0002, lint_should_be_default_plane};
 use crate::lint::rule::{test_finding, test_no_finding};
 test_finding!(

View File

@@ -2,6 +2,6 @@ mod camel_case;
 mod default_plane;
 mod offset_plane;
-pub use camel_case::{lint_object_properties, lint_variables, Z0001};
-pub use default_plane::{lint_should_be_default_plane, Z0002};
-pub use offset_plane::{lint_should_be_offset_plane, Z0003};
+pub use camel_case::{Z0001, lint_object_properties, lint_variables};
+pub use default_plane::{Z0002, lint_should_be_default_plane};
+pub use offset_plane::{Z0003, lint_should_be_offset_plane};

View File

@@ -1,15 +1,15 @@
 use anyhow::Result;
 use crate::{
+SourceRange,
-engine::{PlaneName, DEFAULT_PLANE_INFO},
+engine::{DEFAULT_PLANE_INFO, PlaneName},
 errors::Suggestion,
-execution::{types::UnitLen, PlaneInfo, Point3d},
+execution::{PlaneInfo, Point3d, types::UnitLen},
-lint::rule::{def_finding, Discovered, Finding},
+lint::rule::{Discovered, Finding, def_finding},
 parsing::ast::types::{
 BinaryPart, CallExpressionKw, Expr, LiteralValue, Node as AstNode, ObjectExpression, Program, UnaryOperator,
 },
 walk::Node,
-SourceRange,
 };
 def_finding!(
@@ -39,14 +39,11 @@ pub fn lint_should_be_offset_plane(node: Node, _prog: &AstNode<Program>) -> Resu
 }
 let suggestion = Suggestion {
 title: "use offsetPlane instead".to_owned(),
-insert: format!("offsetPlane({}, offset = {})", plane_name, offset),
+insert: format!("offsetPlane({plane_name}, offset = {offset})"),
 source_range: call_source_range,
 };
 Ok(vec![Z0003.at(
-format!(
-"custom plane in startSketchOn; offsetPlane from {} would work here",
-plane_name
-),
+format!("custom plane in startSketchOn; offsetPlane from {plane_name} would work here"),
 call_source_range,
 Some(suggestion),
 )])
@@ -68,16 +65,16 @@ fn get_xyz(point: &ObjectExpression) -> Option<(f64, f64, f64)> {
 for property in &point.properties {
 let Some(value) = (match &property.value {
-Expr::UnaryExpression(ref value) => {
+Expr::UnaryExpression(value) => {
 if value.operator != UnaryOperator::Neg {
 continue;
 }
-let BinaryPart::Literal(ref value) = &value.inner.argument else {
+let BinaryPart::Literal(value) = &value.inner.argument else {
 continue;
 };
 unlitafy(&value.inner.value).map(|v| -v)
 }
-Expr::Literal(ref value) => unlitafy(&value.value),
+Expr::Literal(value) => unlitafy(&value.value),
 _ => {
 continue;
 }
@@ -271,7 +268,7 @@ fn normalize_plane_info(plane_info: &PlaneInfo) -> PlaneInfo {
 #[cfg(test)]
 mod tests {
-use super::{lint_should_be_offset_plane, Z0003};
+use super::{Z0003, lint_should_be_offset_plane};
 use crate::lint::rule::{test_finding, test_no_finding};
 test_finding!(

View File

@@ -4,11 +4,11 @@ use serde::Serialize;
 use tower_lsp::lsp_types::{Diagnostic, DiagnosticSeverity};
 use crate::{
+SourceRange,
 errors::Suggestion,
 lsp::IntoDiagnostic,
 parsing::ast::types::{Node as AstNode, Program},
 walk::Node,
-SourceRange,
 };
 /// Check the provided AST for any found rule violations.
@@ -180,7 +180,7 @@ impl Finding {
 }
 macro_rules! def_finding {
-( $code:ident, $title:expr, $description:expr ) => {
+( $code:ident, $title:expr_2021, $description:expr_2021 ) => {
 /// Generated Finding
 pub const $code: Finding = $crate::lint::rule::finding!($code, $title, $description);
 };
@@ -188,7 +188,7 @@ macro_rules! def_finding {
 pub(crate) use def_finding;
 macro_rules! finding {
-( $code:ident, $title:expr, $description:expr ) => {
+( $code:ident, $title:expr_2021, $description:expr_2021 ) => {
 $crate::lint::rule::Finding {
 code: stringify!($code),
 title: $title,
@@ -205,7 +205,7 @@ pub(crate) use test::{assert_finding, assert_no_finding, test_finding, test_no_f
 mod test {
 macro_rules! assert_no_finding {
-( $check:expr, $finding:expr, $kcl:expr ) => {
+( $check:expr_2021, $finding:expr_2021, $kcl:expr_2021 ) => {
 let prog = $crate::Program::parse_no_errs($kcl).unwrap();
 // Ensure the code still works.
@@ -220,7 +220,7 @@ mod test {
 }
 macro_rules! assert_finding {
-( $check:expr, $finding:expr, $kcl:expr, $output:expr, $suggestion:expr ) => {
+( $check:expr_2021, $finding:expr_2021, $kcl:expr_2021, $output:expr_2021, $suggestion:expr_2021 ) => {
 let prog = $crate::Program::parse_no_errs($kcl).unwrap();
 // Ensure the code still works.
@@ -250,7 +250,7 @@ mod test {
 }
 macro_rules! test_finding {
-( $name:ident, $check:expr, $finding:expr, $kcl:expr, $output:expr, $suggestion:expr ) => {
+( $name:ident, $check:expr_2021, $finding:expr_2021, $kcl:expr_2021, $output:expr_2021, $suggestion:expr_2021 ) => {
 #[tokio::test]
 async fn $name() {
 $crate::lint::rule::assert_finding!($check, $finding, $kcl, $output, $suggestion);
@@ -259,7 +259,7 @@ mod test {
 }
 macro_rules! test_no_finding {
-( $name:ident, $check:expr, $finding:expr, $kcl:expr ) => {
+( $name:ident, $check:expr_2021, $finding:expr_2021, $kcl:expr_2021 ) => {
 #[tokio::test]
 async fn $name() {
 $crate::lint::rule::assert_no_finding!($check, $finding, $kcl);

View File

@@ -90,7 +90,7 @@ where
 async fn do_initialized(&self, params: InitializedParams) {
 self.client()
-.log_message(MessageType::INFO, format!("initialized: {:?}", params))
+.log_message(MessageType::INFO, format!("initialized: {params:?}"))
 .await;
 self.set_is_initialized(true).await;
@@ -139,7 +139,7 @@ where
 self.client()
 .log_message(
 MessageType::WARNING,
-format!("updating from disk `{}` failed: {:?}", project_dir, err),
+format!("updating from disk `{project_dir}` failed: {err:?}"),
 )
 .await;
 }
@@ -148,19 +148,19 @@ where
 async fn do_did_change_configuration(&self, params: DidChangeConfigurationParams) {
 self.client()
-.log_message(MessageType::INFO, format!("configuration changed: {:?}", params))
+.log_message(MessageType::INFO, format!("configuration changed: {params:?}"))
 .await;
 }
 async fn do_did_change_watched_files(&self, params: DidChangeWatchedFilesParams) {
 self.client()
-.log_message(MessageType::INFO, format!("watched files changed: {:?}", params))
+.log_message(MessageType::INFO, format!("watched files changed: {params:?}"))
 .await;
 }
 async fn do_did_create_files(&self, params: CreateFilesParams) {
 self.client()
-.log_message(MessageType::INFO, format!("files created: {:?}", params))
+.log_message(MessageType::INFO, format!("files created: {params:?}"))
 .await;
 // Create each file in the code map.
 for file in params.files {
@@ -170,7 +170,7 @@ where
 async fn do_did_rename_files(&self, params: RenameFilesParams) {
 self.client()
-.log_message(MessageType::INFO, format!("files renamed: {:?}", params))
+.log_message(MessageType::INFO, format!("files renamed: {params:?}"))
 .await;
 // Rename each file in the code map.
 for file in params.files {
@@ -186,7 +186,7 @@ where
 async fn do_did_delete_files(&self, params: DeleteFilesParams) {
 self.client()
-.log_message(MessageType::INFO, format!("files deleted: {:?}", params))
+.log_message(MessageType::INFO, format!("files deleted: {params:?}"))
 .await;
 // Delete each file in the map.
 for file in params.files {
@@ -228,7 +228,7 @@ where
 async fn do_did_close(&self, params: DidCloseTextDocumentParams) {
 self.client()
-.log_message(MessageType::INFO, format!("document closed: {:?}", params))
+.log_message(MessageType::INFO, format!("document closed: {params:?}"))
 .await;
 }
 }

View File

@@ -13,6 +13,7 @@ use std::{
 use dashmap::DashMap;
 use serde::{Deserialize, Serialize};
 use tower_lsp::{
+LanguageServer,
 jsonrpc::{Error, Result},
 lsp_types::{
 CreateFilesParams, DeleteFilesParams, Diagnostic, DidChangeConfigurationParams, DidChangeTextDocumentParams,
@@ -22,7 +23,6 @@ use tower_lsp::{
 TextDocumentSyncKind, TextDocumentSyncOptions, WorkspaceFolder, WorkspaceFoldersServerCapabilities,
 WorkspaceServerCapabilities,
 },
-LanguageServer,
 };
 use crate::lsp::{
@@ -198,7 +198,7 @@ impl Backend {
 .map_err(|err| Error {
 code: tower_lsp::jsonrpc::ErrorCode::from(69),
 data: None,
-message: Cow::from(format!("Failed to get completions from zoo api: {}", err)),
+message: Cow::from(format!("Failed to get completions from zoo api: {err}")),
 })?;
 Ok(resp.completions)
 }
@@ -209,7 +209,7 @@ impl Backend {
 let mut lock = copy.write().map_err(|err| Error {
 code: tower_lsp::jsonrpc::ErrorCode::from(69),
 data: None,
-message: Cow::from(format!("Failed lock: {}", err)),
+message: Cow::from(format!("Failed lock: {err}")),
 })?;
 *lock = params;
 Ok(Success::new(true))
@@ -254,7 +254,7 @@ impl Backend {
 .map_err(|err| Error {
 code: tower_lsp::jsonrpc::ErrorCode::from(69),
 data: None,
-message: Cow::from(format!("Failed to get completions: {}", err)),
+message: Cow::from(format!("Failed to get completions: {err}")),
 })?;
 #[cfg(not(test))]
 let mut completion_list = vec![];
@@ -294,7 +294,7 @@ part001 = cube(pos = [0,0], scale = 20)
 pub async fn accept_completion(&self, params: CopilotAcceptCompletionParams) {
 self.client
-.log_message(MessageType::INFO, format!("Accepted completions: {:?}", params))
+.log_message(MessageType::INFO, format!("Accepted completions: {params:?}"))
 .await;
 // Get the original telemetry data.
@@ -303,7 +303,7 @@ part001 = cube(pos = [0,0], scale = 20)
 };
 self.client
-.log_message(MessageType::INFO, format!("Original telemetry: {:?}", original))
+.log_message(MessageType::INFO, format!("Original telemetry: {original:?}"))
 .await;
 // TODO: Send the telemetry data to the zoo api.
@@ -311,7 +311,7 @@ part001 = cube(pos = [0,0], scale = 20)
 pub async fn reject_completions(&self, params: CopilotRejectCompletionParams) {
 self.client
-.log_message(MessageType::INFO, format!("Rejected completions: {:?}", params))
+.log_message(MessageType::INFO, format!("Rejected completions: {params:?}"))
 .await;
 // Get the original telemetry data.
@@ -323,7 +323,7 @@ part001 = cube(pos = [0,0], scale = 20)
 }
 self.client
-.log_message(MessageType::INFO, format!("Original telemetry: {:?}", originals))
+.log_message(MessageType::INFO, format!("Original telemetry: {originals:?}"))
 .await;
 // TODO: Send the telemetry data to the zoo api.

View File

@@ -85,7 +85,7 @@ impl CopilotCompletionResponse {
 impl CopilotCyclingCompletion {
 pub fn new(text: String, line_before: String, position: CopilotPosition) -> Self {
 let display_text = text.clone();
-let text = format!("{}{}", line_before, text);
+let text = format!("{line_before}{text}");
 let end_char = text.find('\n').unwrap_or(text.len()) as u32;
 Self {
 uuid: uuid::Uuid::new_v4(),

View File

@@ -3,7 +3,7 @@ use std::collections::HashMap;
 use serde::{Deserialize, Serialize};
 use tower_lsp::lsp_types::Range as LspRange;
-use crate::{parsing::ast::types::*, SourceRange};
+use crate::{SourceRange, parsing::ast::types::*};
 /// Describes information about a hover.
 #[derive(Debug, Clone, Deserialize, Serialize, PartialEq)]

View File

@@ -15,6 +15,7 @@ use dashmap::DashMap;
 use sha2::Digest;
 use tokio::sync::RwLock;
 use tower_lsp::{
+Client, LanguageServer,
 jsonrpc::Result as RpcResult,
 lsp_types::{
 CodeAction, CodeActionKind, CodeActionOptions, CodeActionOrCommand, CodeActionParams,
@@ -37,10 +38,10 @@ use tower_lsp::{
 TextDocumentSyncCapability, TextDocumentSyncKind, TextDocumentSyncOptions, TextEdit, WorkDoneProgressOptions,
 WorkspaceEdit, WorkspaceFolder, WorkspaceFoldersServerCapabilities, WorkspaceServerCapabilities,
 },
-Client, LanguageServer,
 };
 use crate::{
+ModuleId, Program, SourceRange,
 docs::kcl_doc::ModData,
 errors::LspSuggestion,
 exec::KclValue,
@@ -51,11 +52,10 @@ use crate::{
 util::IntoDiagnostic,
 },
 parsing::{
+PIPE_OPERATOR,
 ast::types::{Expr, VariableKind},
 token::TokenStream,
-PIPE_OPERATOR,
 },
-ModuleId, Program, SourceRange,
 };
 pub mod custom_notifications;
@@ -290,10 +290,9 @@ impl crate::lsp::backend::Backend for Backend {
 };
 // Get the previous tokens.
-let tokens_changed = if let Some(previous_tokens) = self.token_map.get(&filename) {
-*previous_tokens != tokens
-} else {
-true
+let tokens_changed = match self.token_map.get(&filename) {
+Some(previous_tokens) => *previous_tokens != tokens,
+_ => true,
 };
 let had_diagnostics = self.has_diagnostics(params.uri.as_ref()).await;
@@ -424,7 +423,7 @@ impl Backend {
 self.client
 .log_message(
 MessageType::ERROR,
-format!("token type `{:?}` not accounted for", token_type),
+format!("token type `{token_type:?}` not accounted for"),
 )
 .await;
 continue;
@@ -436,119 +435,121 @@ impl Backend {
 // Calculate the token modifiers.
 // Get the value at the current position.
-let token_modifiers_bitset = if let Some(ast) = self.ast_map.get(params.uri.as_str()) {
+let token_modifiers_bitset = match self.ast_map.get(params.uri.as_str()) {
+Some(ast) => {
 let token_index = Arc::new(Mutex::new(token_type_index));
 let modifier_index: Arc<Mutex<u32>> = Arc::new(Mutex::new(0));
 crate::walk::walk(&ast.ast, |node: crate::walk::Node| {
 let Ok(node_range): Result<SourceRange, _> = (&node).try_into() else {
 return Ok(true);
 };
 if !node_range.contains(source_range.start()) {
 return Ok(true);
 }
 let get_modifier = |modifier: Vec<SemanticTokenModifier>| -> Result<bool> {
 let mut mods = modifier_index.lock().map_err(|_| anyhow::anyhow!("mutex"))?;
 let Some(token_modifier_index) = self.get_semantic_token_modifier_index(modifier) else {
 return Ok(true);
 };
 if *mods == 0 {
 *mods = token_modifier_index;
 } else {
 *mods |= token_modifier_index;
 }
 Ok(false)
 };
 match node {
 crate::walk::Node::TagDeclarator(_) => {
 return get_modifier(vec![
 SemanticTokenModifier::DEFINITION,
 SemanticTokenModifier::STATIC,
 ]);
 }
 crate::walk::Node::VariableDeclarator(variable) => {
 let sr: SourceRange = (&variable.id).into();
 if sr.contains(source_range.start()) {
 if let Expr::FunctionExpression(_) = &variable.init {
 let mut ti = token_index.lock().map_err(|_| anyhow::anyhow!("mutex"))?;
 *ti = match self.get_semantic_token_type_index(&SemanticTokenType::FUNCTION) {
 Some(index) => index,
 None => token_type_index,
 };
 }
 return get_modifier(vec![
 SemanticTokenModifier::DECLARATION,
 SemanticTokenModifier::READONLY,
 ]);
 }
 }
 crate::walk::Node::Parameter(_) => {
 let mut ti = token_index.lock().map_err(|_| anyhow::anyhow!("mutex"))?;
 *ti = match self.get_semantic_token_type_index(&SemanticTokenType::PARAMETER) {
 Some(index) => index,
 None => token_type_index,
 };
 return Ok(false);
 }
 crate::walk::Node::MemberExpression(member_expression) => {
 let sr: SourceRange = (&member_expression.property).into();
 if sr.contains(source_range.start()) {
 let mut ti = token_index.lock().map_err(|_| anyhow::anyhow!("mutex"))?;
 *ti = match self.get_semantic_token_type_index(&SemanticTokenType::PROPERTY) {
 Some(index) => index,
 None => token_type_index,
 };
 return Ok(false);
 }
 }
 crate::walk::Node::ObjectProperty(object_property) => {
 let sr: SourceRange = (&object_property.key).into();
 if sr.contains(source_range.start()) {
 let mut ti = token_index.lock().map_err(|_| anyhow::anyhow!("mutex"))?;
 *ti = match self.get_semantic_token_type_index(&SemanticTokenType::PROPERTY) {
 Some(index) => index,
 None => token_type_index,
 };
 }
 return get_modifier(vec![SemanticTokenModifier::DECLARATION]);
 }
 crate::walk::Node::CallExpressionKw(call_expr) => {
 let sr: SourceRange = (&call_expr.callee).into();
 if sr.contains(source_range.start()) {
 let mut ti = token_index.lock().map_err(|_| anyhow::anyhow!("mutex"))?;
 *ti = match self.get_semantic_token_type_index(&SemanticTokenType::FUNCTION) {
 Some(index) => index,
 None => token_type_index,
 };
 if self.stdlib_completions.contains_key(&call_expr.callee.name.name) {
 // This is a stdlib function.
 return get_modifier(vec![SemanticTokenModifier::DEFAULT_LIBRARY]);
 }
 return Ok(false);
 }
 }
 _ => {}
 }
 Ok(true)
 })
 .unwrap_or_default();
-let t = if let Ok(guard) = token_index.lock() { *guard } else { 0 };
+let t = match token_index.lock() {
+Ok(guard) => *guard,
+_ => 0,
+};
 token_type_index = t;
-let m = if let Ok(guard) = modifier_index.lock() {
-*guard
-} else {
-0
-};
-m
-} else {
-0
+match modifier_index.lock() {
+Ok(guard) => *guard,
+_ => 0,
+}
+}
+_ => 0,
 };
 // We need to check if we are on the last token of the line.
@@ -652,11 +653,14 @@ impl Backend {
 .await;
 }
-let mut items = if let Some(items) = self.diagnostics_map.get(params.uri.as_str()) {
+let mut items = match self.diagnostics_map.get(params.uri.as_str()) {
+Some(items) => {
 // TODO: Would be awesome to fix the clone here.
 items.clone()
-} else {
+}
+_ => {
 vec![]
+}
 };
 for diagnostic in diagnostics {
@@ -768,7 +772,7 @@ impl Backend {
 // Read hash digest and consume hasher
 let result = hasher.finalize();
 // Get the hash as a string.
-let user_id_hash = format!("{:x}", result);
+let user_id_hash = format!("{result:x}");
 // Get the workspace folders.
 // The key of the workspace folder is the project name.
@@ -866,7 +870,7 @@ impl Backend {
 impl LanguageServer for Backend {
 async fn initialize(&self, params: InitializeParams) -> RpcResult<InitializeResult> {
 self.client
-.log_message(MessageType::INFO, format!("initialize: {:?}", params))
+.log_message(MessageType::INFO, format!("initialize: {params:?}"))
 .await;
 Ok(InitializeResult {
@@ -1006,7 +1010,7 @@ impl LanguageServer for Backend {
 #[cfg(not(target_arch = "wasm32"))]
 if let Err(err) = self.send_telemetry().await {
 self.client
-.log_message(MessageType::WARNING, format!("failed to send telemetry: {}", err))
+.log_message(MessageType::WARNING, format!("failed to send telemetry: {err}"))
 .await;
 }
 }
@@ -1090,7 +1094,7 @@ impl LanguageServer for Backend {
 Ok(Some(LspHover {
 contents: HoverContents::Markup(MarkupContent {
 kind: MarkupKind::Markdown,
-value: format!("```\n{}{}\n```\n\n{}", name, sig, docs),
+value: format!("```\n{name}{sig}\n```\n\n{docs}"),
 }),
 range: Some(range),
 }))
@@ -1118,7 +1122,7 @@ impl LanguageServer for Backend {
 Ok(Some(LspHover {
 contents: HoverContents::Markup(MarkupContent {
 kind: MarkupKind::Markdown,
-value: format!("```\n{}\n```\n\n{}", name, docs),
+value: format!("```\n{name}\n```\n\n{docs}"),
 }),
 range: Some(range),
 }))
@@ -1153,17 +1157,17 @@ impl LanguageServer for Backend {
 } => Ok(Some(LspHover {
 contents: HoverContents::Markup(MarkupContent {
 kind: MarkupKind::Markdown,
-value: format!("```\n{}: {}\n```", name, ty),
+value: format!("```\n{name}: {ty}\n```"),
 }),
 range: Some(range),
 })),
 Hover::Variable { name, ty: None, range } => Ok(with_cached_var(&name, |value| {
-let mut text: String = format!("```\n{}", name);
+let mut text: String = format!("```\n{name}");
 if let Some(ty) = value.principal_type() {
 text.push_str(&format!(": {}", ty.human_friendly_type()));
 }
 if let Some(v) = value.value_str() {
-text.push_str(&format!(" = {}", v));
+text.push_str(&format!(" = {v}"));
 }
 text.push_str("\n```");

View File

@@ -13,8 +13,8 @@ use tower_lsp::lsp_types::{Diagnostic, DiagnosticSeverity, DiagnosticTag};
 pub use util::IntoDiagnostic;
 use crate::{
-errors::{Severity, Tag},
 CompilationError,
+errors::{Severity, Tag},
 };
 impl IntoDiagnostic for CompilationError {

View File

@ -2,18 +2,18 @@ use std::collections::{BTreeMap, HashMap};
use pretty_assertions::assert_eq; use pretty_assertions::assert_eq;
use tower_lsp::{ use tower_lsp::{
LanguageServer,
lsp_types::{ lsp_types::{
CodeActionKind, CodeActionOrCommand, Diagnostic, PrepareRenameResponse, SemanticTokenModifier, CodeActionKind, CodeActionOrCommand, Diagnostic, PrepareRenameResponse, SemanticTokenModifier,
SemanticTokenType, TextEdit, WorkspaceEdit, SemanticTokenType, TextEdit, WorkspaceEdit,
}, },
LanguageServer,
}; };
use crate::{ use crate::{
SourceRange,
errors::{LspSuggestion, Suggestion}, errors::{LspSuggestion, Suggestion},
lsp::test_util::{copilot_lsp_server, kcl_lsp_server}, lsp::test_util::{copilot_lsp_server, kcl_lsp_server},
parsing::ast::types::{Node, Program}, parsing::ast::types::{Node, Program},
SourceRange,
}; };
#[track_caller] #[track_caller]
@ -276,11 +276,7 @@ async fn test_updating_kcl_lsp_files() {
assert_eq!(server.code_map.len(), 11); assert_eq!(server.code_map.len(), 11);
// Just make sure that one of the current files read from disk is accurate. // Just make sure that one of the current files read from disk is accurate.
assert_eq!( assert_eq!(
server server.code_map.get(&format!("{string_path}/util.rs")).unwrap().clone(),
.code_map
.get(&format!("{}/util.rs", string_path))
.unwrap()
.clone(),
include_str!("util.rs").as_bytes() include_str!("util.rs").as_bytes()
); );
} }
@@ -633,7 +629,7 @@ async fn test_kcl_lsp_create_zip() {
     }
     assert_eq!(files.len(), 12);
-    let util_path = format!("{}/util.rs", string_path).replace("file://", "");
+    let util_path = format!("{string_path}/util.rs").replace("file://", "");
     assert!(files.contains_key(&util_path));
     assert_eq!(files.get("/test.kcl"), Some(&4));
 }
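The `format!("{}/util.rs", string_path)` → `format!("{string_path}/util.rs")` rewrites in these hunks use inline format arguments, stable since Rust 1.58 and nudged by clippy's `uninlined_format_args` lint. A minimal sketch (the `string_path` value here is hypothetical, not the test fixture's real path):

```rust
fn main() {
    // Hypothetical value standing in for the test fixture path.
    let string_path = "file:///project";

    // Positional argument, the older style removed in this diff:
    let old = format!("{}/util.rs", string_path);
    // Inline named capture, the style the diff moves to:
    let new = format!("{string_path}/util.rs");

    assert_eq!(old, new);
    println!("{new}");
}
```

Note that inline capture only works with plain identifiers; an expression such as `self.0.identifier.name` still needs a positional `{}` argument, which is why the multi-line `write!` later in this diff keeps one.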
@@ -2359,7 +2355,7 @@ async fn test_kcl_lsp_diagnostic_has_lints() {
         assert_eq!(diagnostics.full_document_diagnostic_report.items.len(), 1);
         assert_eq!(
             diagnostics.full_document_diagnostic_report.items[0].message,
-            "Identifiers must be lowerCamelCase"
+            "Identifiers should be lowerCamelCase"
         );
     } else {
         panic!("Expected full diagnostics");


@@ -7,7 +7,7 @@ use serde::{Deserialize, Serialize};
 use crate::{
     errors::{KclError, KclErrorDetails},
     exec::KclValue,
-    execution::{typed_path::TypedPath, EnvironmentRef, ModuleArtifactState, PreImportedGeometry},
+    execution::{EnvironmentRef, ModuleArtifactState, PreImportedGeometry, typed_path::TypedPath},
     fs::{FileManager, FileSystem},
     parsing::ast::types::{ImportPath, Node, Program},
     source_range::SourceRange,
@@ -73,13 +73,13 @@ impl ModuleLoader {
     }

     pub(crate) fn enter_module(&mut self, path: &ModulePath) {
-        if let ModulePath::Local { value: ref path } = path {
+        if let ModulePath::Local { value: path } = path {
             self.import_stack.push(path.clone());
         }
     }

     pub(crate) fn leave_module(&mut self, path: &ModulePath) {
-        if let ModulePath::Local { value: ref path } = path {
+        if let ModulePath::Local { value: path } = path {
             let popped = self.import_stack.pop().unwrap();
             assert_eq!(path, &popped);
         }
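The `value: ref path` → `value: path` changes above lean on match ergonomics: when the scrutinee is a reference (here `path: &ModulePath`), pattern bindings are by-reference automatically, so the explicit `ref` is redundant and the edition migration removes it. A self-contained sketch, using a simplified stand-in for the KCL `ModulePath` type:

```rust
// Simplified stand-in for the KCL ModulePath enum, for illustration only.
#[derive(Debug, Clone, PartialEq)]
enum ModulePath {
    Local { value: String },
    Std,
}

fn main() {
    let path = ModulePath::Local { value: "main.kcl".to_string() };

    // Matching through a reference: `value` binds as `&String` with no `ref`,
    // exactly like the rewritten enter_module/leave_module bodies.
    if let ModulePath::Local { value } = &path {
        let pushed: String = value.clone(); // clone through the reference
        assert_eq!(pushed, "main.kcl");
    }

    let _ = ModulePath::Std; // keep the second variant referenced
}
```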


@@ -2,8 +2,8 @@ pub(crate) mod digest;
 pub mod types;
 use crate::{
-    parsing::ast::types::{BinaryPart, BodyItem, Expr, LiteralIdentifier},
     ModuleId,
+    parsing::ast::types::{BinaryPart, BodyItem, Expr, LiteralIdentifier},
 };
 impl BodyItem {


@@ -25,15 +25,14 @@ pub use crate::parsing::ast::types::{
     none::KclNone,
 };
 use crate::{
+    ModuleId, TypedPath,
     errors::KclError,
     execution::{
-        annotations, KclValue, Metadata, TagIdentifier,
+        annotations,
         types::{ArrayLen, UnitAngle, UnitLen},
+        KclValue, Metadata, TagIdentifier,
     },
-    parsing::{ast::digest::Digest, token::NumericSuffix, PIPE_OPERATOR},
+    parsing::{PIPE_OPERATOR, ast::digest::Digest, token::NumericSuffix},
     source_range::SourceRange,
-    ModuleId, TypedPath,
 };
 mod condition;
@@ -72,18 +71,18 @@ impl<T: JsonSchema> schemars::JsonSchema for Node<T> {
         T::schema_name()
     }
-    fn json_schema(gen: &mut schemars::gen::SchemaGenerator) -> schemars::schema::Schema {
-        let mut child = T::json_schema(gen).into_object();
+    fn json_schema(r#gen: &mut schemars::r#gen::SchemaGenerator) -> schemars::schema::Schema {
+        let mut child = T::json_schema(r#gen).into_object();
         // We want to add the start and end fields to the schema.
         // Ideally we would add _any_ extra fields from the Node type automatically
         // but this is a bit hard since this isn't a macro.
-        let Some(ref mut object) = &mut child.object else {
+        let Some(object) = &mut child.object else {
             // This should never happen. But it will panic at compile time of docs if it does.
             // Which is better than runtime.
             panic!("Expected object schema for {}", T::schema_name());
         };
-        object.properties.insert("start".to_string(), usize::json_schema(gen));
-        object.properties.insert("end".to_string(), usize::json_schema(gen));
+        object.properties.insert("start".to_string(), usize::json_schema(r#gen));
+        object.properties.insert("end".to_string(), usize::json_schema(r#gen));
         schemars::schema::Schema::Object(child.clone())
     }
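The `gen` → `r#gen` renames above are an edition artifact: `gen` becomes a reserved keyword in Rust 2024 (for generator blocks), so any identifier spelled `gen` — including the schemars `SchemaGenerator` parameter and its module path here — must use the raw-identifier form. A tiny illustration (the `describe` function is hypothetical; raw identifiers also compile on older editions):

```rust
// `gen` is reserved in Rust 2024, so the raw identifier `r#gen` is required
// to keep using the name. This also compiles under editions 2015-2021.
fn describe(r#gen: &str) -> String {
    // Raw identifiers cannot be captured inline in a format string,
    // so a positional argument is used here.
    format!("generator: {}", r#gen)
}

fn main() {
    assert_eq!(describe("schemars"), "generator: schemars");
}
```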
@@ -681,7 +680,7 @@ impl Program {
                     break;
                 }
             }
-            BodyItem::VariableDeclaration(ref mut variable_declaration) => {
+            BodyItem::VariableDeclaration(variable_declaration) => {
                 if let Some(var_old_name) = variable_declaration.rename_symbol(new_name, pos) {
                     old_name = Some(var_old_name);
                     break;
@@ -705,18 +704,16 @@ impl Program {
         // Recurse over the item.
         let mut value = match item {
             BodyItem::ImportStatement(_) => None, // TODO
-            BodyItem::ExpressionStatement(ref mut expression_statement) => {
-                Some(&mut expression_statement.expression)
-            }
-            BodyItem::VariableDeclaration(ref mut variable_declaration) => {
+            BodyItem::ExpressionStatement(expression_statement) => Some(&mut expression_statement.expression),
+            BodyItem::VariableDeclaration(variable_declaration) => {
                 variable_declaration.get_mut_expr_for_position(pos)
             }
             BodyItem::TypeDeclaration(_) => None,
-            BodyItem::ReturnStatement(ref mut return_statement) => Some(&mut return_statement.argument),
+            BodyItem::ReturnStatement(return_statement) => Some(&mut return_statement.argument),
         };
         // Check if we have a function expression.
-        if let Some(Expr::FunctionExpression(ref mut function_expression)) = &mut value {
+        if let Some(Expr::FunctionExpression(function_expression)) = &mut value {
             // Check if the params to the function expression contain the position.
             for param in &mut function_expression.params {
                 let param_source_range: SourceRange = (&param.identifier).into();
@@ -764,7 +761,7 @@ impl Program {
             BodyItem::ExpressionStatement(_) => {
                 continue;
             }
-            BodyItem::VariableDeclaration(ref mut variable_declaration) => {
+            BodyItem::VariableDeclaration(variable_declaration) => {
                 if variable_declaration.declaration.id.name == name {
                     variable_declaration.declaration = declarator;
                     return;
@@ -783,14 +780,14 @@ impl Program {
         for item in &mut self.body {
             match item {
                 BodyItem::ImportStatement(_) => {} // TODO
-                BodyItem::ExpressionStatement(ref mut expression_statement) => expression_statement
+                BodyItem::ExpressionStatement(expression_statement) => expression_statement
                     .expression
                     .replace_value(source_range, new_value.clone()),
-                BodyItem::VariableDeclaration(ref mut variable_declaration) => {
+                BodyItem::VariableDeclaration(variable_declaration) => {
                     variable_declaration.replace_value(source_range, new_value.clone())
                 }
                 BodyItem::TypeDeclaration(_) => {}
-                BodyItem::ReturnStatement(ref mut return_statement) => {
+                BodyItem::ReturnStatement(return_statement) => {
                     return_statement.argument.replace_value(source_range, new_value.clone())
                 }
             }
@@ -1040,18 +1037,18 @@ impl Expr {
         }
         match self {
-            Expr::BinaryExpression(ref mut bin_exp) => bin_exp.replace_value(source_range, new_value),
-            Expr::ArrayExpression(ref mut array_exp) => array_exp.replace_value(source_range, new_value),
-            Expr::ArrayRangeExpression(ref mut array_range) => array_range.replace_value(source_range, new_value),
-            Expr::ObjectExpression(ref mut obj_exp) => obj_exp.replace_value(source_range, new_value),
+            Expr::BinaryExpression(bin_exp) => bin_exp.replace_value(source_range, new_value),
+            Expr::ArrayExpression(array_exp) => array_exp.replace_value(source_range, new_value),
+            Expr::ArrayRangeExpression(array_range) => array_range.replace_value(source_range, new_value),
+            Expr::ObjectExpression(obj_exp) => obj_exp.replace_value(source_range, new_value),
             Expr::MemberExpression(_) => {}
             Expr::Literal(_) => {}
-            Expr::FunctionExpression(ref mut func_exp) => func_exp.replace_value(source_range, new_value),
-            Expr::CallExpressionKw(ref mut call_exp) => call_exp.replace_value(source_range, new_value),
+            Expr::FunctionExpression(func_exp) => func_exp.replace_value(source_range, new_value),
+            Expr::CallExpressionKw(call_exp) => call_exp.replace_value(source_range, new_value),
             Expr::Name(_) => {}
             Expr::TagDeclarator(_) => {}
-            Expr::PipeExpression(ref mut pipe_exp) => pipe_exp.replace_value(source_range, new_value),
-            Expr::UnaryExpression(ref mut unary_exp) => unary_exp.replace_value(source_range, new_value),
+            Expr::PipeExpression(pipe_exp) => pipe_exp.replace_value(source_range, new_value),
+            Expr::UnaryExpression(unary_exp) => unary_exp.replace_value(source_range, new_value),
             Expr::IfExpression(_) => {}
             Expr::PipeSubstitution(_) => {}
             Expr::LabelledExpression(expr) => expr.expr.replace_value(source_range, new_value),
@@ -1113,25 +1110,19 @@ impl Expr {
     fn rename_identifiers(&mut self, old_name: &str, new_name: &str) {
         match self {
             Expr::Literal(_literal) => {}
-            Expr::Name(ref mut identifier) => identifier.rename(old_name, new_name),
-            Expr::TagDeclarator(ref mut tag) => tag.rename(old_name, new_name),
-            Expr::BinaryExpression(ref mut binary_expression) => {
-                binary_expression.rename_identifiers(old_name, new_name)
-            }
+            Expr::Name(identifier) => identifier.rename(old_name, new_name),
+            Expr::TagDeclarator(tag) => tag.rename(old_name, new_name),
+            Expr::BinaryExpression(binary_expression) => binary_expression.rename_identifiers(old_name, new_name),
             Expr::FunctionExpression(_function_identifier) => {}
-            Expr::CallExpressionKw(ref mut call_expression) => call_expression.rename_identifiers(old_name, new_name),
-            Expr::PipeExpression(ref mut pipe_expression) => pipe_expression.rename_identifiers(old_name, new_name),
+            Expr::CallExpressionKw(call_expression) => call_expression.rename_identifiers(old_name, new_name),
+            Expr::PipeExpression(pipe_expression) => pipe_expression.rename_identifiers(old_name, new_name),
             Expr::PipeSubstitution(_) => {}
-            Expr::ArrayExpression(ref mut array_expression) => array_expression.rename_identifiers(old_name, new_name),
-            Expr::ArrayRangeExpression(ref mut array_range) => array_range.rename_identifiers(old_name, new_name),
-            Expr::ObjectExpression(ref mut object_expression) => {
-                object_expression.rename_identifiers(old_name, new_name)
-            }
-            Expr::MemberExpression(ref mut member_expression) => {
-                member_expression.rename_identifiers(old_name, new_name)
-            }
-            Expr::UnaryExpression(ref mut unary_expression) => unary_expression.rename_identifiers(old_name, new_name),
-            Expr::IfExpression(ref mut expr) => expr.rename_identifiers(old_name, new_name),
+            Expr::ArrayExpression(array_expression) => array_expression.rename_identifiers(old_name, new_name),
+            Expr::ArrayRangeExpression(array_range) => array_range.rename_identifiers(old_name, new_name),
+            Expr::ObjectExpression(object_expression) => object_expression.rename_identifiers(old_name, new_name),
+            Expr::MemberExpression(member_expression) => member_expression.rename_identifiers(old_name, new_name),
+            Expr::UnaryExpression(unary_expression) => unary_expression.rename_identifiers(old_name, new_name),
+            Expr::IfExpression(expr) => expr.rename_identifiers(old_name, new_name),
             Expr::LabelledExpression(expr) => expr.expr.rename_identifiers(old_name, new_name),
             Expr::AscribedExpression(expr) => expr.expr.rename_identifiers(old_name, new_name),
             Expr::None(_) => {}
@@ -1325,15 +1316,9 @@ impl BinaryPart {
         match self {
             BinaryPart::Literal(_) => {}
             BinaryPart::Name(_) => {}
-            BinaryPart::BinaryExpression(ref mut binary_expression) => {
-                binary_expression.replace_value(source_range, new_value)
-            }
-            BinaryPart::CallExpressionKw(ref mut call_expression) => {
-                call_expression.replace_value(source_range, new_value)
-            }
-            BinaryPart::UnaryExpression(ref mut unary_expression) => {
-                unary_expression.replace_value(source_range, new_value)
-            }
+            BinaryPart::BinaryExpression(binary_expression) => binary_expression.replace_value(source_range, new_value),
+            BinaryPart::CallExpressionKw(call_expression) => call_expression.replace_value(source_range, new_value),
+            BinaryPart::UnaryExpression(unary_expression) => unary_expression.replace_value(source_range, new_value),
             BinaryPart::MemberExpression(_) => {}
             BinaryPart::IfExpression(e) => e.replace_value(source_range, new_value),
             BinaryPart::AscribedExpression(e) => e.expr.replace_value(source_range, new_value),
@@ -1370,21 +1355,13 @@ impl BinaryPart {
     fn rename_identifiers(&mut self, old_name: &str, new_name: &str) {
         match self {
             BinaryPart::Literal(_literal) => {}
-            BinaryPart::Name(ref mut identifier) => identifier.rename(old_name, new_name),
-            BinaryPart::BinaryExpression(ref mut binary_expression) => {
-                binary_expression.rename_identifiers(old_name, new_name)
-            }
-            BinaryPart::CallExpressionKw(ref mut call_expression) => {
-                call_expression.rename_identifiers(old_name, new_name)
-            }
-            BinaryPart::UnaryExpression(ref mut unary_expression) => {
-                unary_expression.rename_identifiers(old_name, new_name)
-            }
-            BinaryPart::MemberExpression(ref mut member_expression) => {
-                member_expression.rename_identifiers(old_name, new_name)
-            }
-            BinaryPart::IfExpression(ref mut if_expression) => if_expression.rename_identifiers(old_name, new_name),
-            BinaryPart::AscribedExpression(ref mut e) => e.expr.rename_identifiers(old_name, new_name),
+            BinaryPart::Name(identifier) => identifier.rename(old_name, new_name),
+            BinaryPart::BinaryExpression(binary_expression) => binary_expression.rename_identifiers(old_name, new_name),
+            BinaryPart::CallExpressionKw(call_expression) => call_expression.rename_identifiers(old_name, new_name),
+            BinaryPart::UnaryExpression(unary_expression) => unary_expression.rename_identifiers(old_name, new_name),
+            BinaryPart::MemberExpression(member_expression) => member_expression.rename_identifiers(old_name, new_name),
+            BinaryPart::IfExpression(if_expression) => if_expression.rename_identifiers(old_name, new_name),
+            BinaryPart::AscribedExpression(e) => e.expr.rename_identifiers(old_name, new_name),
         }
     }
 }
@@ -2824,7 +2801,7 @@ impl MemberExpression {
         self.object.rename_identifiers(old_name, new_name);
         match &mut self.property {
-            LiteralIdentifier::Identifier(ref mut identifier) => identifier.rename(old_name, new_name),
+            LiteralIdentifier::Identifier(identifier) => identifier.rename(old_name, new_name),
             LiteralIdentifier::Literal(_) => {}
         }
     }
@@ -3312,7 +3289,7 @@ impl Type {
                 .map(|t| t.human_friendly_type())
                 .collect::<Vec<_>>()
                 .join(" or "),
-            Type::Object { .. } => format!("an object with fields `{}`", self),
+            Type::Object { .. } => format!("an object with fields `{self}`"),
         }
     }
@@ -3469,7 +3446,11 @@ pub struct RequiredParamAfterOptionalParam(pub Box<Parameter>);
 impl std::fmt::Display for RequiredParamAfterOptionalParam {
     fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
-        write!(f, "KCL functions must declare any optional parameters after all the required parameters. But your required parameter {} is _after_ an optional parameter. You must move it to before the optional parameters instead.", self.0.identifier.name)
+        write!(
+            f,
+            "KCL functions must declare any optional parameters after all the required parameters. But your required parameter {} is _after_ an optional parameter. You must move it to before the optional parameters instead.",
+            self.0.identifier.name
+        )
     }
 }


@@ -3,8 +3,8 @@
 use super::CompilationError;
 use crate::{
-    parsing::ast::types::{BinaryExpression, BinaryOperator, BinaryPart, Node},
     SourceRange,
+    parsing::ast::types::{BinaryExpression, BinaryOperator, BinaryPart, Node},
 };
 /// Parses a list of tokens (in infix order, i.e. as the user typed them)
@@ -127,11 +127,11 @@ impl From<BinaryOperator> for BinaryExpressionToken {
 mod tests {
     use super::*;
     use crate::{
+        ModuleId,
         parsing::{
             ast::types::{Literal, LiteralValue},
             token::NumericSuffix,
         },
-        ModuleId,
     };
     #[test]


@@ -1,11 +1,11 @@
 use crate::{
+    ModuleId,
     errors::{CompilationError, KclError, KclErrorDetails},
     parsing::{
         ast::types::{Node, Program},
         token::TokenStream,
     },
     source_range::SourceRange,
-    ModuleId,
 };
 pub(crate) mod ast;
@@ -18,7 +18,7 @@ pub const PIPE_OPERATOR: &str = "|>";
 // `?` like behavior for `Result`s to return a ParseResult if there is an error.
 macro_rules! pr_try {
-    ($e: expr) => {
+    ($e: expr_2021) => {
         match $e {
             Ok(a) => a,
             Err(e) => return e.into(),
@@ -187,7 +187,7 @@ pub fn deprecation(s: &str, kind: DeprecationKind) -> Option<&'static str> {
 #[cfg(test)]
 mod tests {
     macro_rules! parse_and_lex {
-        ($func_name:ident, $test_kcl_program:expr) => {
+        ($func_name:ident, $test_kcl_program:expr_2021) => {
             #[test]
             fn $func_name() {
                 let _ = crate::parsing::top_level_parse($test_kcl_program);
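The `expr` → `expr_2021` fragment-specifier changes keep these macros behaving the same across the edition bump: under Rust 2024, a plain `expr` fragment additionally matches `const { ... }` blocks and `_`, while `expr_2021` pins the Rust 2021 expression grammar. A sketch under that assumption (`pr_try_demo` is a simplified stand-in for the crate's `pr_try!`, returning `Err(e)` rather than `e.into()`, and `expr_2021` needs a recent toolchain):

```rust
// Simplified stand-in for the pr_try! macro in this diff: early-return on Err.
// `expr_2021` matches the Rust 2021 expression grammar, so the macro's
// accepted inputs do not silently widen under edition 2024.
macro_rules! pr_try_demo {
    ($e:expr_2021) => {
        match $e {
            Ok(a) => a,
            Err(e) => return Err(e),
        }
    };
}

fn parse_plus_one(s: &str) -> Result<i32, std::num::ParseIntError> {
    let n = pr_try_demo!(s.parse::<i32>());
    Ok(n + 1)
}

fn main() {
    assert_eq!(parse_plus_one("41"), Ok(42));
    assert!(parse_plus_one("oops").is_err());
}
```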


@@ -14,14 +14,16 @@ use winnow::{
 };
 use super::{
+    DeprecationKind,
     ast::types::{AscribedExpression, ImportPath, LabelledExpression},
     token::{NumericSuffix, RESERVED_WORDS},
-    DeprecationKind,
 };
 use crate::{
+    IMPORT_FILE_EXTENSIONS, SourceRange, TypedPath,
     errors::{CompilationError, Severity, Tag},
     execution::types::ArrayLen,
     parsing::{
+        PIPE_OPERATOR, PIPE_SUBSTITUTION_OPERATOR,
         ast::types::{
             Annotation, ArrayExpression, ArrayRangeExpression, BinaryExpression, BinaryOperator, BinaryPart, BodyItem,
             BoxNode, CallExpressionKw, CommentStyle, DefaultParamVal, ElseIf, Expr, ExpressionStatement,
@@ -33,9 +35,7 @@ use crate::{
         },
         math::BinaryExpressionToken,
         token::{Token, TokenSlice, TokenType},
-        PIPE_OPERATOR, PIPE_SUBSTITUTION_OPERATOR,
     },
-    SourceRange, TypedPath, IMPORT_FILE_EXTENSIONS,
 };
 thread_local! {
@@ -602,7 +602,7 @@ fn binary_operator(i: &mut TokenSlice) -> ModalResult<BinaryOperator> {
             return Err(CompilationError::fatal(
                 token.as_source_range(),
                 format!("{} is not a binary operator", token.value.as_str()),
-            ))
+            ));
         }
     };
     Ok(op)
@@ -726,7 +726,7 @@ fn shebang(i: &mut TokenSlice) -> ModalResult<Node<Shebang>> {
     opt(whitespace).parse_next(i)?;
     Ok(Node::new(
-        Shebang::new(format!("#!{}", value)),
+        Shebang::new(format!("#!{value}")),
         0,
         tokens.last().unwrap().end,
         tokens.first().unwrap().module_id,
@@ -1926,7 +1926,7 @@ fn validate_path_string(path_string: String, var_name: bool, path_range: SourceR
             return Err(ErrMode::Cut(
                 CompilationError::fatal(
                     path_range,
-                    format!("Invalid import path for import from std: {}.", path_string),
+                    format!("Invalid import path for import from std: {path_string}."),
                 )
                 .into(),
             ));
@@ -1938,7 +1938,10 @@
         if !IMPORT_FILE_EXTENSIONS.contains(&extn.to_string_lossy().to_string()) {
             ParseContext::warn(CompilationError::err(
                 path_range,
-                format!("unsupported import path format. KCL files can be imported from the current project, CAD files with the following formats are supported: {}", IMPORT_FILE_EXTENSIONS.join(", ")),
+                format!(
+                    "unsupported import path format. KCL files can be imported from the current project, CAD files with the following formats are supported: {}",
+                    IMPORT_FILE_EXTENSIONS.join(", ")
+                ),
             ))
         }
         ImportPath::Foreign {
@@ -2210,7 +2213,7 @@ fn declaration(i: &mut TokenSlice) -> ModalResult<BoxNode<VariableDeclaration>>
         if matches!(val, Expr::FunctionExpression(_)) {
             return Err(CompilationError::fatal(
                 SourceRange::new(start, dec_end, id.module_id),
-                format!("Expected a `fn` variable kind, found: `{}`", kind),
+                format!("Expected a `fn` variable kind, found: `{kind}`"),
             ));
         }
         Ok(val)
@@ -3312,10 +3315,10 @@ fn fn_call_kw(i: &mut TokenSlice) -> ModalResult<Node<CallExpressionKw>> {
             ParseContext::warn(
                 CompilationError::err(
                     result.as_source_range(),
-                    format!("Calling `{}` is deprecated, prefer using `{}`.", callee_str, suggestion),
+                    format!("Calling `{callee_str}` is deprecated, prefer using `{suggestion}`."),
                 )
                 .with_suggestion(
-                    format!("Replace `{}` with `{}`", callee_str, suggestion),
+                    format!("Replace `{callee_str}` with `{suggestion}`"),
                     suggestion,
                     None,
                     Tag::Deprecated,
@@ -3333,13 +3336,13 @@ mod tests {
     use super::*;
     use crate::{
-        parsing::ast::types::{BodyItem, Expr, VariableKind},
         ModuleId,
+        parsing::ast::types::{BodyItem, Expr, VariableKind},
     };
     fn assert_reserved(word: &str) {
         // Try to use it as a variable name.
-        let code = format!(r#"{} = 0"#, word);
+        let code = format!(r#"{word} = 0"#);
         let result = crate::parsing::top_level_parse(code.as_str());
         let err = &result.unwrap_errs().next().unwrap();
         // Which token causes the error may change. In "return = 0", for
@@ -5263,7 +5266,7 @@ mod snapshot_math_tests {
     // The macro takes a KCL program, ensures it tokenizes and parses, then compares
     // its parsed AST to a snapshot (kept in this repo in a file under snapshots/ dir)
     macro_rules! snapshot_test {
-        ($func_name:ident, $test_kcl_program:expr) => {
+        ($func_name:ident, $test_kcl_program:expr_2021) => {
             #[test]
             fn $func_name() {
                 let module_id = crate::ModuleId::default();
@@ -5301,7 +5304,7 @@ mod snapshot_tests {
     // The macro takes a KCL program, ensures it tokenizes and parses, then compares
     // its parsed AST to a snapshot (kept in this repo in a file under snapshots/ dir)
     macro_rules! snapshot_test {
-        ($func_name:ident, $test_kcl_program:expr) => {
+        ($func_name:ident, $test_kcl_program:expr_2021) => {
             #[test]
             fn $func_name() {
                 let module_id = crate::ModuleId::default();


@@ -16,10 +16,10 @@ use winnow::{
 };
 use crate::{
+    CompilationError, ModuleId,
     errors::KclError,
     parsing::ast::types::{ItemVisibility, VariableKind},
     source_range::SourceRange,
-    CompilationError, ModuleId,
 };
 mod tokeniser;
@@ -609,7 +609,7 @@ impl From<ParseError<Input<'_>, winnow::error::ContextError>> for KclError {
         // TODO: Add the Winnow parser context to the error.
         // See https://github.com/KittyCAD/modeling-app/issues/784
         KclError::new_lexical(crate::errors::KclErrorDetails::new(
-            format!("found unknown token '{}'", bad_token),
+            format!("found unknown token '{bad_token}'"),
             vec![SourceRange::new(offset, offset + 1, module_id)],
         ))
     }


@@ -1,19 +1,19 @@
 use fnv::FnvHashMap;
 use lazy_static::lazy_static;
 use winnow::{
+    LocatingSlice, Stateful,
     ascii::{digit1, multispace1},
     combinator::{alt, opt, peek, preceded, repeat},
     error::{ContextError, ParseError},
     prelude::*,
     stream::{Location, Stream},
     token::{any, none_of, take_till, take_until, take_while},
-    LocatingSlice, Stateful,
 };
 use super::TokenStream;
 use crate::{
-    parsing::token::{Token, TokenType},
     ModuleId,
+    parsing::token::{Token, TokenType},
 };
 lazy_static! {
lazy_static! { lazy_static! {

Some files were not shown because too many files have changed in this diff.